diff --git a/Isabelle_DOF-Example-I/IsaDofApplications.thy b/Isabelle_DOF-Example-I/IsaDofApplications.thy index cde82e9..914243b 100644 --- a/Isabelle_DOF-Example-I/IsaDofApplications.thy +++ b/Isabelle_DOF-Example-I/IsaDofApplications.thy @@ -184,7 +184,7 @@ scenarios from the point of view of the ontology modeling. In @{text_section (un we discuss the user-interaction generated from the ontological definitions. Finally, we draw conclusions and discuss related work in @{text_section (unchecked) \conclusion\}. \ -section*[bgrnd::text_section,main_author="Some(@{docitem ''bu''}::author)"] +section*[bgrnd::text_section,main_author="Some(@{author ''bu''}::author)"] \ Background: The Isabelle System \ text*[background::introduction, level="Some 1"]\ While Isabelle is widely perceived as an interactive theorem prover for HOL @@ -246,7 +246,7 @@ can be type-checked before being displayed and can be used for calculations befo typeset. When editing, Isabelle's PIDE offers auto-completion and error-messages while typing the above \<^emph>\semi-formal\ content.\ -section*[isadof::technical,main_author="Some(@{docitem ''adb''}::author)"]\ \<^isadof> \ +section*[isadof::technical,main_author="Some(@{author ''adb''}::author)"]\ \<^isadof> \ text\ An \<^isadof> document consists of three components: \<^item> the \<^emph>\ontology definition\ which is an Isabelle theory file with definitions @@ -635,20 +635,20 @@ text\ We present a selection of interaction scenarios @{example \sc and @{example \cenelec_onto\} with Isabelle/PIDE instrumented by \<^isadof>. 
\ (*<*) -declare_reference*["text-elements"::float] +declare_reference*["text_elements"::float] declare_reference*["hyperlinks"::float] (*>*) subsection*[scholar_pide::example]\ A Scholarly Paper \ -text\ In @{float (unchecked) "text-elements"}~(a) -and @{float (unchecked) "text-elements"}~(b)we show how +text\ In @{float (unchecked) "text_elements"}~(a) +and @{float (unchecked) "text_elements"}~(b)we show how hovering over links permits to explore its meta-information. Clicking on a document class identifier permits to hyperlink into the corresponding class definition (@{float (unchecked) "hyperlinks"}~(a)); hovering over an attribute-definition (which is qualified in order to disambiguate; @{float (unchecked) "hyperlinks"}~(b)). \ -text*["text-elements"::float, +text*["text_elements"::float, main_caption="\Exploring text elements.\"] \ @{fig_content (width=53, height=5, caption="Exploring a reference of a text element.") "figures/Dogfood-II-bgnd1.png" diff --git a/Isabelle_DOF-Example-II/paper.thy b/Isabelle_DOF-Example-II/paper.thy index e54b5f2..9d25e96 100644 --- a/Isabelle_DOF-Example-II/paper.thy +++ b/Isabelle_DOF-Example-II/paper.thy @@ -54,7 +54,7 @@ abstract*[abs, keywordlist="[\Shallow Embedding\,\Process-Alg If you consider citing this paper, please refer to @{cite "HOL-CSP-iFM2020"}. \ text\\ -section*[introheader::introduction,main_author="Some(@{docitem ''bu''}::author)"]\ Introduction \ +section*[introheader::introduction,main_author="Some(@{author ''bu''}::author)"]\ Introduction \ text*[introtext::introduction, level="Some 1"]\ Communicating Sequential Processes (\<^csp>) is a language to specify and verify patterns of interaction of concurrent systems. Together with CCS and LOTOS, it belongs to the family of @@ -126,10 +126,10 @@ attempt to formalize denotational \<^csp> semantics covering a part of Bill Rosc omitted.\}. 
\ -section*["pre"::tc,main_author="Some(@{author \bu\}::author)"] +section*["pre"::technical,main_author="Some(@{author \bu\}::author)"] \Preliminaries\ -subsection*[cspsemantics::tc, main_author="Some(@{author ''bu''})"]\Denotational \<^csp> Semantics\ +subsection*[cspsemantics::technical, main_author="Some(@{author ''bu''})"]\Denotational \<^csp> Semantics\ text\ The denotational semantics (following @{cite "roscoe:csp:1998"}) comes in three layers: the \<^emph>\trace model\, the \<^emph>\(stable) failures model\ and the \<^emph>\failure/divergence model\. @@ -189,7 +189,7 @@ of @{cite "IsobeRoggenbach2010"} is restricted to a variant of the failures mode \ -subsection*["isabelleHol"::tc, main_author="Some(@{author ''bu''})"]\Isabelle/HOL\ +subsection*["isabelleHol"::technical, main_author="Some(@{author ''bu''})"]\Isabelle/HOL\ text\ Nowadays, Isabelle/HOL is one of the major interactive theory development environments @{cite "nipkow.ea:isabelle:2002"}. HOL stands for Higher-Order Logic, a logic based on simply-typed \\\-calculus extended by parametric polymorphism and Haskell-like type-classes. @@ -218,10 +218,10 @@ domain theory for a particular type-class \\::pcpo\, \<^ie> fixed-point induction and other (automated) proof infrastructure. Isabelle's type-inference can automatically infer, for example, that if \\::pcpo\, then \(\ \ \)::pcpo\. 
\ -section*["csphol"::tc,main_author="Some(@{author ''bu''}::author)", level="Some 2"] +section*["csphol"::technical,main_author="Some(@{author ''bu''}::author)", level="Some 2"] \Formalising Denotational \<^csp> Semantics in HOL \ -subsection*["processinv"::tc, main_author="Some(@{author ''bu''})"] +subsection*["processinv"::technical, main_author="Some(@{author ''bu''})"] \Process Invariant and Process Type\ text\ First, we need a slight revision of the concept of \<^emph>\trace\: if \\\ is the type of the atomic events (represented by a type variable), then @@ -272,7 +272,7 @@ but this can be constructed in a straight-forward manner. Suitable definitions f \\\, \\\ and \\\ lifting \fst\ and \snd\ on the new \'\ process\-type allows to derive the above properties for any \P::'\ process\. \ -subsection*["operator"::tc, main_author="Some(@{author ''lina''})"] +subsection*["operator"::technical, main_author="Some(@{author ''lina''})"] \\<^csp> Operators over the Process Type\ text\ Now, the operators of \<^csp> \Skip\, \Stop\, \_\_\, \_\_\, \_\_\,\_\_\_\ etc. for internal choice, external choice, prefix and parallel composition, can @@ -297,7 +297,7 @@ The definitional presentation of the \<^csp> process operators according to @{ci follows always this scheme. This part of the theory comprises around 2000 loc. \ -subsection*["orderings"::tc, main_author="Some(@{author ''bu''})"] +subsection*["orderings"::technical, main_author="Some(@{author ''bu''})"] \Refinement Orderings\ text\ \<^csp> is centered around the idea of process refinement; many critical properties, @@ -327,7 +327,7 @@ states, from which no internal progress is possible. \ -subsection*["fixpoint"::tc, main_author="Some(@{author ''lina''})"] +subsection*["fixpoint"::technical, main_author="Some(@{author ''lina''})"] \Process Ordering and HOLCF\ text\ For any denotational semantics, the fixed point theory giving semantics to systems of recursive equations is considered as keystone. 
Its prerequisite is a complete partial ordering @@ -394,7 +394,7 @@ Fixed-point inductions are the main proof weapon in verifications, together with and the \<^csp> laws. Denotational arguments can be hidden as they are not needed in practical verifications. \ -subsection*["law"::tc, main_author="Some(@{author ''lina''})"] +subsection*["law"::technical, main_author="Some(@{author ''lina''})"] \\<^csp> Rules: Improved Proofs and New Results\ @@ -436,11 +436,11 @@ cases to be considered as well as their complexity makes pen and paper proofs practically infeasible. \ -section*["newResults"::tc,main_author="Some(@{author ''safouan''}::author)", +section*["newResults"::technical,main_author="Some(@{author ''safouan''}::author)", main_author="Some(@{author ''lina''}::author)", level= "Some 3"] \Theoretical Results on Refinement\ text\\ -subsection*["adm"::tc,main_author="Some(@{author ''safouan''}::author)", +subsection*["adm"::technical,main_author="Some(@{author ''safouan''}::author)", main_author="Some(@{author ''lina''}::author)"] \Decomposition Rules\ text\ @@ -476,7 +476,7 @@ The failure and divergence projections of this operator are also interdependent, sequence operator. Hence, this operator is not monotonic with \\\<^sub>\\, \\\<^sub>\\ and \\\<^sub>\\, but monotonic when their combinations are considered. \ -subsection*["processes"::tc,main_author="Some(@{author ''safouan''}::author)", +subsection*["processes"::technical,main_author="Some(@{author ''safouan''}::author)", main_author="Some(@{author ''lina''}::author)"] \Reference Processes and their Properties\ text\ @@ -597,7 +597,7 @@ then it may still be livelock-free. % This makes sense since livelocks are worse \ -section*["advanced"::tc,main_author="Some(@{author ''safouan''}::author)",level="Some 3"] +section*["advanced"::technical,main_author="Some(@{author ''safouan''}::author)",level="Some 3"] \Advanced Verification Techniques\ text\ @@ -612,7 +612,7 @@ verification. 
In the latter case, we present an approach to a verification of a architecture, in this case a ring-structure of arbitrary size. \ -subsection*["illustration"::tc,main_author="Some(@{author ''safouan''}::author)", level="Some 3"] +subsection*["illustration"::technical,main_author="Some(@{author ''safouan''}::author)", level="Some 3"] \The General CopyBuffer Example\ text\ We consider the paradigmatic copy buffer example @{cite "Hoare:1985:CSP:3921" and "Roscoe:UCS:2010"} @@ -677,7 +677,7 @@ corollary deadlock_free COPY \ -subsection*["inductions"::tc,main_author="Some(@{author ''safouan''}::author)"] +subsection*["inductions"::technical,main_author="Some(@{author ''safouan''}::author)"] \New Fixed-Point Inductions\ text\ @@ -727,7 +727,7 @@ The astute reader may notice here that if the induction step is weakened (having the base steps require enforcement. \ -subsection*["norm"::tc,main_author="Some(@{author ''safouan''}::author)"] +subsection*["norm"::technical,main_author="Some(@{author ''safouan''}::author)"] \Normalization\ text\ Our framework can reason not only over infinite alphabets, but also over processes parameterized @@ -787,7 +787,7 @@ Summing up, our method consists of four stages: \ -subsection*["dining_philosophers"::tc,main_author="Some(@{author ''safouan''}::author)",level="Some 3"] +subsection*["dining_philosophers"::technical,main_author="Some(@{author ''safouan''}::author)",level="Some 3"] \Generalized Dining Philosophers\ text\ The dining philosophers problem is another paradigmatic example in the \<^csp> literature @@ -879,7 +879,7 @@ for a dozen of philosophers (on a usual machine) due to the exponential combinat Furthermore, our proof is fairly stable against modifications like adding non synchronized events like thinking or sitting down in contrast to model-checking techniques. 
\ -section*["relatedwork"::tc,main_author="Some(@{author ''lina''}::author)",level="Some 3"] +section*["relatedwork"::technical,main_author="Some(@{author ''lina''}::author)",level="Some 3"] \Related work\ text\ diff --git a/Isabelle_DOF-Examples-Extra/CENELEC_50128/mini_odo/mini_odo.thy b/Isabelle_DOF-Examples-Extra/CENELEC_50128/mini_odo/mini_odo.thy index 3c2cb4d..e36acc4 100644 --- a/Isabelle_DOF-Examples-Extra/CENELEC_50128/mini_odo/mini_odo.thy +++ b/Isabelle_DOF-Examples-Extra/CENELEC_50128/mini_odo/mini_odo.thy @@ -102,13 +102,13 @@ text\ functioning of the system and for its integration into the system as a whole. In particular, we need to make the following assumptions explicit: \<^vs>\-0.3cm\\ -text*["perfect-wheel"::assumption] +text*["perfect_wheel"::assumption] \\<^item> the wheel is perfectly circular with a given, constant radius. \<^vs>\-0.3cm\\ -text*["no-slip"::assumption] +text*["no_slip"::assumption] \\<^item> the slip between the trains wheel and the track negligible. 
\<^vs>\-0.3cm\\ -text*["constant-teeth-dist"::assumption] +text*["constant_teeth_dist"::assumption] \\<^item> the distance between all teeth of a wheel is the same and constant, and \<^vs>\-0.3cm\\ -text*["constant-sampling-rate"::assumption] +text*["constant_sampling_rate"::assumption] \\<^item> the sampling rate of positions is a given constant.\ text\ @@ -126,13 +126,13 @@ text\ subsection\Capturing ``System Architecture.''\ -figure*["three-phase"::figure,relative_width="70",file_src="''figures/three-phase-odo.pdf''"] +figure*["three_phase"::figure,relative_width="70",file_src="''figures/three-phase-odo.pdf''"] \An odometer with three sensors \C1\, \C2\, and \C3\.\ text\ The requirements analysis also contains a document \<^doc_class>\SYSAD\ (\<^typ>\system_architecture_description\) that contains technical drawing of the odometer, - a timing diagram (see \<^figure>\three-phase\), and tables describing the encoding of the position + a timing diagram (see \<^figure>\three_phase\), and tables describing the encoding of the position for the possible signal transitions of the sensors \C1\, \C2\, and \C3\. \ @@ -146,7 +146,7 @@ text\ sub-system configuration. \ (*<*) -declare_reference*["df-numerics-encshaft"::figure] +declare_reference*["df_numerics_encshaft"::figure] (*>*) subsection\Capturing ``Required Performances.''\ text\ @@ -160,9 +160,9 @@ text\ The requirement analysis document describes the physical environment, the architecture of the measuring device, and the required format and precision of the measurements of the odometry - function as represented (see @{figure (unchecked) "df-numerics-encshaft"}).\ + function as represented (see @{figure (unchecked) "df_numerics_encshaft"}).\ -figure*["df-numerics-encshaft"::figure,relative_width="76",file_src="''figures/df-numerics-encshaft.png''"] +figure*["df_numerics_encshaft"::figure,relative_width="76",file_src="''figures/df-numerics-encshaft.png''"] \Real distance vs. discrete distance vs. 
shaft-encoder sequence\ @@ -215,7 +215,7 @@ text\ concepts such as Cauchy Sequences, limits, differentiability, and a very substantial part of classical Calculus. \SOME\ is the Hilbert choice operator from HOL; the definitions of the model parameters admit all possible positive values as uninterpreted constants. Our - \<^assumption>\perfect-wheel\ is translated into a calculation of the circumference of the + \<^assumption>\perfect_wheel\ is translated into a calculation of the circumference of the wheel, while \\s\<^sub>r\<^sub>e\<^sub>s\, the resolution of the odometer, can be calculated from the these parameters. HOL-Analysis permits to formalize the fundamental physical observables: \ diff --git a/Isabelle_DOF-Examples-Extra/beamerx/poster/poster.thy b/Isabelle_DOF-Examples-Extra/beamerx/poster/poster.thy index 78622c7..a795687 100644 --- a/Isabelle_DOF-Examples-Extra/beamerx/poster/poster.thy +++ b/Isabelle_DOF-Examples-Extra/beamerx/poster/poster.thy @@ -30,6 +30,10 @@ text\ \vfill \ +text\ +@{block (title = "\Title\<^sub>t\<^sub>e\<^sub>s\<^sub>t\") "\Block content\<^sub>t\<^sub>e\<^sub>s\<^sub>t\"} +\ + (*<*) end (*>*) diff --git a/Isabelle_DOF-Examples-Extra/technical_report/TR_my_commented_isabelle/TR_MyCommentedIsabelle.thy b/Isabelle_DOF-Examples-Extra/technical_report/TR_my_commented_isabelle/TR_MyCommentedIsabelle.thy index e1c9749..d11a7ed 100644 --- a/Isabelle_DOF-Examples-Extra/technical_report/TR_my_commented_isabelle/TR_MyCommentedIsabelle.thy +++ b/Isabelle_DOF-Examples-Extra/technical_report/TR_my_commented_isabelle/TR_MyCommentedIsabelle.thy @@ -807,7 +807,7 @@ text\ They reflect the Pure logic depicted in a number of presentations s Notated as logical inference rules, these operations were presented as follows: \ -text*["text-elements"::float, +text*["text_elements"::float, main_caption="\Kernel Inference Rules.\"] \ @{fig_content (width=48, caption="Pure Kernel Inference Rules I.") "figures/pure-inferences-I.pdf" diff --git 
a/Isabelle_DOF-Ontologies/Conceptual/Conceptual.thy b/Isabelle_DOF-Ontologies/Conceptual/Conceptual.thy index 75bb658..c3ed081 100644 --- a/Isabelle_DOF-Ontologies/Conceptual/Conceptual.thy +++ b/Isabelle_DOF-Ontologies/Conceptual/Conceptual.thy @@ -131,7 +131,7 @@ type_synonym XX = B section\Examples of inheritance \ -doc_class C = XX + +doc_class C = B + z :: "A option" <= None (* A LINK, i.e. an attribute that has a type referring to a document class. Mathematical relations over document items can be modeled. *) diff --git a/Isabelle_DOF-Unit-Tests/Attributes.thy b/Isabelle_DOF-Unit-Tests/Attributes.thy index 2dab8f1..14ba9f3 100644 --- a/Isabelle_DOF-Unit-Tests/Attributes.thy +++ b/Isabelle_DOF-Unit-Tests/Attributes.thy @@ -160,7 +160,7 @@ ML\ @{docitem_attribute a2::omega}; type_synonym ALFACENTAURI = E -update_instance*[omega::ALFACENTAURI, x+="''inition''"] +update_instance*[omega::E, x+="''inition''"] ML\ val s = HOLogic.dest_string ( @{docitem_attribute x::omega}) \ diff --git a/Isabelle_DOF-Unit-Tests/Concept_OntoReferencing.thy b/Isabelle_DOF-Unit-Tests/Concept_OntoReferencing.thy index 16ab498..6d3ad8c 100644 --- a/Isabelle_DOF-Unit-Tests/Concept_OntoReferencing.thy +++ b/Isabelle_DOF-Unit-Tests/Concept_OntoReferencing.thy @@ -144,8 +144,8 @@ update_instance*[f::F,r:="[@{thm ''Concept_OntoReferencing.some_proof''}]"] text\ ..., mauris amet, id elit aliquam aptent id, ... 
@{docitem \a\} \ (*>*) text\Here we add and maintain a link that is actually modeled as m-to-n relation ...\ -update_instance*[f::F,b:="{(@{docitem \a\}::A,@{docitem \c1\}::C), - (@{docitem \a\}, @{docitem \c2\})}"] +update_instance*[f::F,b:="{(@{A \a\}::A,@{C \c1\}::C), + (@{A \a\}, @{C \c2\})}"] section\Closing the Monitor and testing the Results.\ diff --git a/Isabelle_DOF-Unit-Tests/Concept_TermAntiquotations.thy b/Isabelle_DOF-Unit-Tests/Concept_TermAntiquotations.thy index 2f1933f..ebd64c3 100644 --- a/Isabelle_DOF-Unit-Tests/Concept_TermAntiquotations.thy +++ b/Isabelle_DOF-Unit-Tests/Concept_TermAntiquotations.thy @@ -116,20 +116,22 @@ section\Putting everything together\ text\Major sample: test-item of doc-class \F\ with a relational link between class instances, and links to formal Isabelle items like \typ\, \term\ and \thm\. \ +declare[[ML_print_depth = 10000]] text*[xcv4::F, r="[@{thm ''HOL.refl''}, @{thm \Concept_TermAntiquotations.local_sample_lemma\}]", (* long names required *) - b="{(@{docitem ''xcv1''},@{docitem \xcv2\})}", (* notations \...\ vs. ''...'' *) + b="{(@{A ''xcv1''},@{C \xcv2\})}", (* notations \...\ vs. ''...'' *) s="[@{typ \int list\}]", properties = "[@{term \H \ H\}]" (* notation \...\ required for UTF8*) ]\Lorem ipsum ...\ - +declare[[ML_print_depth = 20]] text*[xcv5::G, g="@{thm \HOL.sym\}"]\Lorem ipsum ...\ text\... and here we add a relation between @{docitem \xcv3\} and @{docitem \xcv2\} into the relation \verb+b+ of @{docitem \xcv5\}. 
Note that in the link-relation, -a @{typ "C"}-type is required, but a @{typ "G"}-type is offered which is legal in -\verb+Isa_DOF+ because of the sub-class relation between those classes: \ -update_instance*[xcv4::F, b+="{(@{docitem ''xcv3''},@{docitem ''xcv5''})}"] +a @{typ "C"}-type is required, so if a @{typ "G"}-type is offered, it is considered illegal +in \verb+Isa_DOF+ despite the sub-class relation between those classes: \ +update_instance-assert-error[xcv4::F, b+="{(@{docitem ''xcv3''},@{docitem ''xcv5''})}"] +\Type unification failed\ text\And here is the results of some ML-term antiquotations:\ ML\ @{docitem_attribute b::xcv4} \ diff --git a/Isabelle_DOF-Unit-Tests/Concept_TermEvaluation.thy b/Isabelle_DOF-Unit-Tests/Concept_TermEvaluation.thy index c23203b..17e8b9d 100644 --- a/Isabelle_DOF-Unit-Tests/Concept_TermEvaluation.thy +++ b/Isabelle_DOF-Unit-Tests/Concept_TermEvaluation.thy @@ -187,7 +187,7 @@ to update the instance @{docitem \xcv4\}: \ update_instance-assert-error[xcv4::F, b+="{(@{A ''xcv3''},@{G ''xcv5''})}"] - \type of attribute: Conceptual.F.b does not fit to term\ + \Type unification failed: Clash of types\ section\\<^theory_text>\assert*\-Annotated assertion-commands\ @@ -225,11 +225,11 @@ text\... and here we reference @{A \assertionA\}.\ (*>*) assert*\evidence @{result \resultProof\} = evidence @{result \resultProof2\}\ -text\The optional evaluator of \value*\ and \assert*\ must be specified after the meta arguments:\ -value* [optional_test_A::A, x=6] [nbe] \filter (\\. A.x \ > 5) @{A_instances}\ +text\The optional evaluator of \value*\ and \assert*\ must be specified before the meta arguments:\ +value* [nbe] [optional_test_A::A, x=6] \filter (\\. 
A.x \ > 5) @{A_instances}\ -assert* [resultProof3::result, evidence = "proof", property="[@{thm \HOL.sym\}]"] [nbe] - \evidence @{result \resultProof3\} = evidence @{result \resultProof2\}\ +assert* [nbe] [resultProof3::result, evidence = "proof", property="[@{thm \HOL.sym\}]"] + \evidence @{result \resultProof3\} = evidence @{result \resultProof2\}\ text\ The evaluation of @{command "assert*"} can be disabled diff --git a/Isabelle_DOF-Unit-Tests/Latex_Tests.thy b/Isabelle_DOF-Unit-Tests/Latex_Tests.thy index 4a31928..7210ca5 100644 --- a/Isabelle_DOF-Unit-Tests/Latex_Tests.thy +++ b/Isabelle_DOF-Unit-Tests/Latex_Tests.thy @@ -390,11 +390,11 @@ text-latex\ ML\ fun gen_enriched_document_command3 name {body} cid_transform attr_transform markdown - (((((oid,pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t, + ((((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t, xstring_opt:(xstring * Position.T) option), toks:Input.source list) = gen_enriched_document_command2 name {body=body} cid_transform attr_transform markdown - (((((oid,pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t, + ((((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t, xstring_opt:(xstring * Position.T) option), toks) \ \Hack : drop second and thrd args.\ diff --git a/Isabelle_DOF-Unit-Tests/OutOfOrderPresntn.thy b/Isabelle_DOF-Unit-Tests/OutOfOrderPresntn.thy index 511466a..c4dc9a5 100644 --- a/Isabelle_DOF-Unit-Tests/OutOfOrderPresntn.thy +++ b/Isabelle_DOF-Unit-Tests/OutOfOrderPresntn.thy @@ -382,11 +382,11 @@ text-latex\ ML\ fun gen_enriched_document_command3 name {body} cid_transform attr_transform markdown - (((((oid,pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t, + ((((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t, xstring_opt:(xstring * Position.T) option), toks:Input.source list) = gen_enriched_document_command2 name {body=body} cid_transform attr_transform markdown - (((((oid,pos),cid_pos), doc_attrs) : 
ODL_Meta_Args_Parser.meta_args_t, + ((((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t, xstring_opt:(xstring * Position.T) option), toks) \ \Hack : drop second and thrd args.\ diff --git a/Isabelle_DOF-Unit-Tests/ROOT b/Isabelle_DOF-Unit-Tests/ROOT index 9fb9052..78d731e 100644 --- a/Isabelle_DOF-Unit-Tests/ROOT +++ b/Isabelle_DOF-Unit-Tests/ROOT @@ -16,6 +16,7 @@ session "Isabelle_DOF-Unit-Tests" = "Isabelle_DOF-Ontologies" + "Cenelec_Test" "OutOfOrderPresntn" "COL_Test" + "Test_Polymorphic_Classes" document_files "root.bib" "figures/A.png" diff --git a/Isabelle_DOF-Unit-Tests/Test.thy b/Isabelle_DOF-Unit-Tests/Test.thy deleted file mode 100644 index 3feb083..0000000 --- a/Isabelle_DOF-Unit-Tests/Test.thy +++ /dev/null @@ -1,463 +0,0 @@ -(* Title: HOL/Record.thy - Author: Wolfgang Naraschewski, TU Muenchen - Author: Markus Wenzel, TU Muenchen - Author: Norbert Schirmer, TU Muenchen - Author: Thomas Sewell, NICTA - Author: Florian Haftmann, TU Muenchen -*) - -section \Extensible records with structural subtyping\ - -theory Test -imports HOL.Quickcheck_Exhaustive -keywords - "record*" :: thy_defn and - "print_record*" :: diag -begin - -subsection \Introduction\ - -text \ - Records are isomorphic to compound tuple types. To implement - efficient records, we make this isomorphism explicit. Consider the - record access/update simplification \alpha (beta_update f - rec) = alpha rec\ for distinct fields alpha and beta of some record - rec with n fields. There are \n ^ 2\ such theorems, which - prohibits storage of all of them for large n. The rules can be - proved on the fly by case decomposition and simplification in O(n) - time. By creating O(n) isomorphic-tuple types while defining the - record, however, we can prove the access/update simplification in - \O(log(n)^2)\ time. 
- - The O(n) cost of case decomposition is not because O(n) steps are - taken, but rather because the resulting rule must contain O(n) new - variables and an O(n) size concrete record construction. To sidestep - this cost, we would like to avoid case decomposition in proving - access/update theorems. - - Record types are defined as isomorphic to tuple types. For instance, - a record type with fields \'a\, \'b\, \'c\ - and \'d\ might be introduced as isomorphic to \'a \ - ('b \ ('c \ 'd))\. If we balance the tuple tree to \('a \ - 'b) \ ('c \ 'd)\ then accessors can be defined by converting to the - underlying type then using O(log(n)) fst or snd operations. - Updators can be defined similarly, if we introduce a \fst_update\ and \snd_update\ function. Furthermore, we can - prove the access/update theorem in O(log(n)) steps by using simple - rewrites on fst, snd, \fst_update\ and \snd_update\. - - The catch is that, although O(log(n)) steps were taken, the - underlying type we converted to is a tuple tree of size - O(n). Processing this term type wastes performance. We avoid this - for large n by taking each subtree of size K and defining a new type - isomorphic to that tuple subtree. A record can now be defined as - isomorphic to a tuple tree of these O(n/K) new types, or, if \n > K*K\, we can repeat the process, until the record can be - defined in terms of a tuple tree of complexity less than the - constant K. - - If we prove the access/update theorem on this type with the - analogous steps to the tuple tree, we consume \O(log(n)^2)\ - time as the intermediate terms are \O(log(n))\ in size and - the types needed have size bounded by K. To enable this analogous - traversal, we define the functions seen below: \iso_tuple_fst\, \iso_tuple_snd\, \iso_tuple_fst_update\ - and \iso_tuple_snd_update\. These functions generalise tuple - operations by taking a parameter that encapsulates a tuple - isomorphism. 
The rewrites needed on these functions now need an - additional assumption which is that the isomorphism works. - - These rewrites are typically used in a structured way. They are here - presented as the introduction rule \isomorphic_tuple.intros\ - rather than as a rewrite rule set. The introduction form is an - optimisation, as net matching can be performed at one term location - for each step rather than the simplifier searching the term for - possible pattern matches. The rule set is used as it is viewed - outside the locale, with the locale assumption (that the isomorphism - is valid) left as a rule assumption. All rules are structured to aid - net matching, using either a point-free form or an encapsulating - predicate. -\ - -subsection \Operators and lemmas for types isomorphic to tuples\ - -datatype (dead 'a, dead 'b, dead 'c) tuple_isomorphism = - Tuple_Isomorphism "'a \ 'b \ 'c" "'b \ 'c \ 'a" - -primrec - repr :: "('a, 'b, 'c) tuple_isomorphism \ 'a \ 'b \ 'c" where - "repr (Tuple_Isomorphism r a) = r" - -primrec - abst :: "('a, 'b, 'c) tuple_isomorphism \ 'b \ 'c \ 'a" where - "abst (Tuple_Isomorphism r a) = a" - -definition - iso_tuple_fst :: "('a, 'b, 'c) tuple_isomorphism \ 'a \ 'b" where - "iso_tuple_fst isom = fst \ repr isom" - -definition - iso_tuple_snd :: "('a, 'b, 'c) tuple_isomorphism \ 'a \ 'c" where - "iso_tuple_snd isom = snd \ repr isom" - -definition - iso_tuple_fst_update :: - "('a, 'b, 'c) tuple_isomorphism \ ('b \ 'b) \ ('a \ 'a)" where - "iso_tuple_fst_update isom f = abst isom \ apfst f \ repr isom" - -definition - iso_tuple_snd_update :: - "('a, 'b, 'c) tuple_isomorphism \ ('c \ 'c) \ ('a \ 'a)" where - "iso_tuple_snd_update isom f = abst isom \ apsnd f \ repr isom" - -definition - iso_tuple_cons :: - "('a, 'b, 'c) tuple_isomorphism \ 'b \ 'c \ 'a" where - "iso_tuple_cons isom = curry (abst isom)" - - -subsection \Logical infrastructure for records\ - -definition - iso_tuple_surjective_proof_assist :: "'a \ 'b \ ('a \ 'b) \ bool" 
where - "iso_tuple_surjective_proof_assist x y f \ f x = y" - -definition - iso_tuple_update_accessor_cong_assist :: - "(('b \ 'b) \ ('a \ 'a)) \ ('a \ 'b) \ bool" where - "iso_tuple_update_accessor_cong_assist upd ac \ - (\f v. upd (\x. f (ac v)) v = upd f v) \ (\v. upd id v = v)" - -definition - iso_tuple_update_accessor_eq_assist :: - "(('b \ 'b) \ ('a \ 'a)) \ ('a \ 'b) \ 'a \ ('b \ 'b) \ 'a \ 'b \ bool" where - "iso_tuple_update_accessor_eq_assist upd ac v f v' x \ - upd f v = v' \ ac v = x \ iso_tuple_update_accessor_cong_assist upd ac" - -lemma update_accessor_congruence_foldE: - assumes uac: "iso_tuple_update_accessor_cong_assist upd ac" - and r: "r = r'" and v: "ac r' = v'" - and f: "\v. v' = v \ f v = f' v" - shows "upd f r = upd f' r'" - using uac r v [symmetric] - apply (subgoal_tac "upd (\x. f (ac r')) r' = upd (\x. f' (ac r')) r'") - apply (simp add: iso_tuple_update_accessor_cong_assist_def) - apply (simp add: f) - done - -lemma update_accessor_congruence_unfoldE: - "iso_tuple_update_accessor_cong_assist upd ac \ - r = r' \ ac r' = v' \ (\v. 
v = v' \ f v = f' v) \ - upd f r = upd f' r'" - apply (erule(2) update_accessor_congruence_foldE) - apply simp - done - -lemma iso_tuple_update_accessor_cong_assist_id: - "iso_tuple_update_accessor_cong_assist upd ac \ upd id = id" - by rule (simp add: iso_tuple_update_accessor_cong_assist_def) - -lemma update_accessor_noopE: - assumes uac: "iso_tuple_update_accessor_cong_assist upd ac" - and ac: "f (ac x) = ac x" - shows "upd f x = x" - using uac - by (simp add: ac iso_tuple_update_accessor_cong_assist_id [OF uac, unfolded id_def] - cong: update_accessor_congruence_unfoldE [OF uac]) - -lemma update_accessor_noop_compE: - assumes uac: "iso_tuple_update_accessor_cong_assist upd ac" - and ac: "f (ac x) = ac x" - shows "upd (g \ f) x = upd g x" - by (simp add: ac cong: update_accessor_congruence_unfoldE[OF uac]) - -lemma update_accessor_cong_assist_idI: - "iso_tuple_update_accessor_cong_assist id id" - by (simp add: iso_tuple_update_accessor_cong_assist_def) - -lemma update_accessor_cong_assist_triv: - "iso_tuple_update_accessor_cong_assist upd ac \ - iso_tuple_update_accessor_cong_assist upd ac" - by assumption - -lemma update_accessor_accessor_eqE: - "iso_tuple_update_accessor_eq_assist upd ac v f v' x \ ac v = x" - by (simp add: iso_tuple_update_accessor_eq_assist_def) - -lemma update_accessor_updator_eqE: - "iso_tuple_update_accessor_eq_assist upd ac v f v' x \ upd f v = v'" - by (simp add: iso_tuple_update_accessor_eq_assist_def) - -lemma iso_tuple_update_accessor_eq_assist_idI: - "v' = f v \ iso_tuple_update_accessor_eq_assist id id v f v' v" - by (simp add: iso_tuple_update_accessor_eq_assist_def update_accessor_cong_assist_idI) - -lemma iso_tuple_update_accessor_eq_assist_triv: - "iso_tuple_update_accessor_eq_assist upd ac v f v' x \ - iso_tuple_update_accessor_eq_assist upd ac v f v' x" - by assumption - -lemma iso_tuple_update_accessor_cong_from_eq: - "iso_tuple_update_accessor_eq_assist upd ac v f v' x \ - iso_tuple_update_accessor_cong_assist upd ac" - by 
(simp add: iso_tuple_update_accessor_eq_assist_def) - -lemma iso_tuple_surjective_proof_assistI: - "f x = y \ iso_tuple_surjective_proof_assist x y f" - by (simp add: iso_tuple_surjective_proof_assist_def) - -lemma iso_tuple_surjective_proof_assist_idE: - "iso_tuple_surjective_proof_assist x y id \ x = y" - by (simp add: iso_tuple_surjective_proof_assist_def) - -locale isomorphic_tuple = - fixes isom :: "('a, 'b, 'c) tuple_isomorphism" - assumes repr_inv: "\x. abst isom (repr isom x) = x" - and abst_inv: "\y. repr isom (abst isom y) = y" -begin - -lemma repr_inj: "repr isom x = repr isom y \ x = y" - by (auto dest: arg_cong [of "repr isom x" "repr isom y" "abst isom"] - simp add: repr_inv) - -lemma abst_inj: "abst isom x = abst isom y \ x = y" - by (auto dest: arg_cong [of "abst isom x" "abst isom y" "repr isom"] - simp add: abst_inv) - -lemmas simps = Let_def repr_inv abst_inv repr_inj abst_inj - -lemma iso_tuple_access_update_fst_fst: - "f \ h g = j \ f \ - (f \ iso_tuple_fst isom) \ (iso_tuple_fst_update isom \ h) g = - j \ (f \ iso_tuple_fst isom)" - by (clarsimp simp: iso_tuple_fst_update_def iso_tuple_fst_def simps - fun_eq_iff) - -lemma iso_tuple_access_update_snd_snd: - "f \ h g = j \ f \ - (f \ iso_tuple_snd isom) \ (iso_tuple_snd_update isom \ h) g = - j \ (f \ iso_tuple_snd isom)" - by (clarsimp simp: iso_tuple_snd_update_def iso_tuple_snd_def simps - fun_eq_iff) - -lemma iso_tuple_access_update_fst_snd: - "(f \ iso_tuple_fst isom) \ (iso_tuple_snd_update isom \ h) g = - id \ (f \ iso_tuple_fst isom)" - by (clarsimp simp: iso_tuple_snd_update_def iso_tuple_fst_def simps - fun_eq_iff) - -lemma iso_tuple_access_update_snd_fst: - "(f \ iso_tuple_snd isom) \ (iso_tuple_fst_update isom \ h) g = - id \ (f \ iso_tuple_snd isom)" - by (clarsimp simp: iso_tuple_fst_update_def iso_tuple_snd_def simps - fun_eq_iff) - -lemma iso_tuple_update_swap_fst_fst: - "h f \ j g = j g \ h f \ - (iso_tuple_fst_update isom \ h) f \ (iso_tuple_fst_update isom \ j) g = - 
(iso_tuple_fst_update isom \ j) g \ (iso_tuple_fst_update isom \ h) f" - by (clarsimp simp: iso_tuple_fst_update_def simps apfst_compose fun_eq_iff) - -lemma iso_tuple_update_swap_snd_snd: - "h f \ j g = j g \ h f \ - (iso_tuple_snd_update isom \ h) f \ (iso_tuple_snd_update isom \ j) g = - (iso_tuple_snd_update isom \ j) g \ (iso_tuple_snd_update isom \ h) f" - by (clarsimp simp: iso_tuple_snd_update_def simps apsnd_compose fun_eq_iff) - -lemma iso_tuple_update_swap_fst_snd: - "(iso_tuple_snd_update isom \ h) f \ (iso_tuple_fst_update isom \ j) g = - (iso_tuple_fst_update isom \ j) g \ (iso_tuple_snd_update isom \ h) f" - by (clarsimp simp: iso_tuple_fst_update_def iso_tuple_snd_update_def - simps fun_eq_iff) - -lemma iso_tuple_update_swap_snd_fst: - "(iso_tuple_fst_update isom \ h) f \ (iso_tuple_snd_update isom \ j) g = - (iso_tuple_snd_update isom \ j) g \ (iso_tuple_fst_update isom \ h) f" - by (clarsimp simp: iso_tuple_fst_update_def iso_tuple_snd_update_def simps - fun_eq_iff) - -lemma iso_tuple_update_compose_fst_fst: - "h f \ j g = k (f \ g) \ - (iso_tuple_fst_update isom \ h) f \ (iso_tuple_fst_update isom \ j) g = - (iso_tuple_fst_update isom \ k) (f \ g)" - by (clarsimp simp: iso_tuple_fst_update_def simps apfst_compose fun_eq_iff) - -lemma iso_tuple_update_compose_snd_snd: - "h f \ j g = k (f \ g) \ - (iso_tuple_snd_update isom \ h) f \ (iso_tuple_snd_update isom \ j) g = - (iso_tuple_snd_update isom \ k) (f \ g)" - by (clarsimp simp: iso_tuple_snd_update_def simps apsnd_compose fun_eq_iff) - -lemma iso_tuple_surjective_proof_assist_step: - "iso_tuple_surjective_proof_assist v a (iso_tuple_fst isom \ f) \ - iso_tuple_surjective_proof_assist v b (iso_tuple_snd isom \ f) \ - iso_tuple_surjective_proof_assist v (iso_tuple_cons isom a b) f" - by (clarsimp simp: iso_tuple_surjective_proof_assist_def simps - iso_tuple_fst_def iso_tuple_snd_def iso_tuple_cons_def) - -lemma iso_tuple_fst_update_accessor_cong_assist: - assumes 
"iso_tuple_update_accessor_cong_assist f g" - shows "iso_tuple_update_accessor_cong_assist - (iso_tuple_fst_update isom \ f) (g \ iso_tuple_fst isom)" -proof - - from assms have "f id = id" - by (rule iso_tuple_update_accessor_cong_assist_id) - with assms show ?thesis - by (clarsimp simp: iso_tuple_update_accessor_cong_assist_def simps - iso_tuple_fst_update_def iso_tuple_fst_def) -qed - -lemma iso_tuple_snd_update_accessor_cong_assist: - assumes "iso_tuple_update_accessor_cong_assist f g" - shows "iso_tuple_update_accessor_cong_assist - (iso_tuple_snd_update isom \ f) (g \ iso_tuple_snd isom)" -proof - - from assms have "f id = id" - by (rule iso_tuple_update_accessor_cong_assist_id) - with assms show ?thesis - by (clarsimp simp: iso_tuple_update_accessor_cong_assist_def simps - iso_tuple_snd_update_def iso_tuple_snd_def) -qed - -lemma iso_tuple_fst_update_accessor_eq_assist: - assumes "iso_tuple_update_accessor_eq_assist f g a u a' v" - shows "iso_tuple_update_accessor_eq_assist - (iso_tuple_fst_update isom \ f) (g \ iso_tuple_fst isom) - (iso_tuple_cons isom a b) u (iso_tuple_cons isom a' b) v" -proof - - from assms have "f id = id" - by (auto simp add: iso_tuple_update_accessor_eq_assist_def - intro: iso_tuple_update_accessor_cong_assist_id) - with assms show ?thesis - by (clarsimp simp: iso_tuple_update_accessor_eq_assist_def - iso_tuple_fst_update_def iso_tuple_fst_def - iso_tuple_update_accessor_cong_assist_def iso_tuple_cons_def simps) -qed - -lemma iso_tuple_snd_update_accessor_eq_assist: - assumes "iso_tuple_update_accessor_eq_assist f g b u b' v" - shows "iso_tuple_update_accessor_eq_assist - (iso_tuple_snd_update isom \ f) (g \ iso_tuple_snd isom) - (iso_tuple_cons isom a b) u (iso_tuple_cons isom a b') v" -proof - - from assms have "f id = id" - by (auto simp add: iso_tuple_update_accessor_eq_assist_def - intro: iso_tuple_update_accessor_cong_assist_id) - with assms show ?thesis - by (clarsimp simp: iso_tuple_update_accessor_eq_assist_def - 
iso_tuple_snd_update_def iso_tuple_snd_def - iso_tuple_update_accessor_cong_assist_def iso_tuple_cons_def simps) -qed - -lemma iso_tuple_cons_conj_eqI: - "a = c \ b = d \ P \ Q \ - iso_tuple_cons isom a b = iso_tuple_cons isom c d \ P \ Q" - by (clarsimp simp: iso_tuple_cons_def simps) - -lemmas intros = - iso_tuple_access_update_fst_fst - iso_tuple_access_update_snd_snd - iso_tuple_access_update_fst_snd - iso_tuple_access_update_snd_fst - iso_tuple_update_swap_fst_fst - iso_tuple_update_swap_snd_snd - iso_tuple_update_swap_fst_snd - iso_tuple_update_swap_snd_fst - iso_tuple_update_compose_fst_fst - iso_tuple_update_compose_snd_snd - iso_tuple_surjective_proof_assist_step - iso_tuple_fst_update_accessor_eq_assist - iso_tuple_snd_update_accessor_eq_assist - iso_tuple_fst_update_accessor_cong_assist - iso_tuple_snd_update_accessor_cong_assist - iso_tuple_cons_conj_eqI - -end - -lemma isomorphic_tuple_intro: - fixes repr abst - assumes repr_inj: "\x y. repr x = repr y \ x = y" - and abst_inv: "\z. repr (abst z) = z" - and v: "v \ Tuple_Isomorphism repr abst" - shows "isomorphic_tuple v" -proof - fix x have "repr (abst (repr x)) = repr x" - by (simp add: abst_inv) - then show "Test.abst v (Test.repr v x) = x" - by (simp add: v repr_inj) -next - fix y - show "Test.repr v (Test.abst v y) = y" - by (simp add: v) (fact abst_inv) -qed - -definition - "tuple_iso_tuple \ Tuple_Isomorphism id id" - -lemma tuple_iso_tuple: - "isomorphic_tuple tuple_iso_tuple" - by (simp add: isomorphic_tuple_intro [OF _ _ reflexive] tuple_iso_tuple_def) - -lemma refl_conj_eq: "Q = R \ P \ Q \ P \ R" - by simp - -lemma iso_tuple_UNIV_I: "x \ UNIV \ True" - by simp - -lemma iso_tuple_True_simp: "(True \ PROP P) \ PROP P" - by simp - -lemma prop_subst: "s = t \ PROP P t \ PROP P s" - by simp - -lemma K_record_comp: "(\x. c) \ f = (\x. 
c)" - by (simp add: comp_def) - - -subsection \Concrete record syntax\ - -nonterminal - ident and - field_type and - field_types and - field and - fields and - field_update and - field_updates - -syntax - "_constify" :: "id => ident" ("_") - "_constify" :: "longid => ident" ("_") - - "_field_type" :: "ident => type => field_type" ("(2_ ::/ _)") - "" :: "field_type => field_types" ("_") - "_field_types" :: "field_type => field_types => field_types" ("_,/ _") - "_record_type" :: "field_types => type" ("(3\_\)") - "_record_type_scheme" :: "field_types => type => type" ("(3\_,/ (2\ ::/ _)\)") - - "_field" :: "ident => 'a => field" ("(2_ =/ _)") - "" :: "field => fields" ("_") - "_fields" :: "field => fields => fields" ("_,/ _") - "_record" :: "fields => 'a" ("(3\_\)") - "_record_scheme" :: "fields => 'a => 'a" ("(3\_,/ (2\ =/ _)\)") - - "_field_update" :: "ident => 'a => field_update" ("(2_ :=/ _)") - "" :: "field_update => field_updates" ("_") - "_field_updates" :: "field_update => field_updates => field_updates" ("_,/ _") - "_record_update" :: "'a => field_updates => 'b" ("_/(3\_\)" [900, 0] 900) - -syntax (ASCII) - "_record_type" :: "field_types => type" ("(3'(| _ |'))") - "_record_type_scheme" :: "field_types => type => type" ("(3'(| _,/ (2... ::/ _) |'))") - "_record" :: "fields => 'a" ("(3'(| _ |'))") - "_record_scheme" :: "fields => 'a => 'a" ("(3'(| _,/ (2... 
=/ _) |'))") - "_record_update" :: "'a => field_updates => 'b" ("_/(3'(| _ |'))" [900, 0] 900) - - -subsection \Record package\ - -ML_file "test.ML" - -hide_const (open) Tuple_Isomorphism repr abst iso_tuple_fst iso_tuple_snd - iso_tuple_fst_update iso_tuple_snd_update iso_tuple_cons - iso_tuple_surjective_proof_assist iso_tuple_update_accessor_cong_assist - iso_tuple_update_accessor_eq_assist tuple_iso_tuple - -end diff --git a/Isabelle_DOF-Unit-Tests/TestKit.thy b/Isabelle_DOF-Unit-Tests/TestKit.thy index 67f5dc7..3423834 100644 --- a/Isabelle_DOF-Unit-Tests/TestKit.thy +++ b/Isabelle_DOF-Unit-Tests/TestKit.thy @@ -23,6 +23,8 @@ keywords "text-" "text-latex" :: document_body and "update_instance-assert-error" :: document_body and "declare_reference-assert-error" :: document_body and "value-assert-error" :: document_body + and "definition-assert-error" :: document_body + and "doc_class-assert-error" :: document_body begin @@ -35,7 +37,8 @@ fun gen_enriched_document_command2 name {body} cid_transform attr_transform mark xstring_opt:(xstring * Position.T) option), toks_list:Input.source list) : theory -> theory = - let val (((oid,pos),cid_pos), doc_attrs) = meta_args + let val ((binding,cid_pos), doc_attrs) = meta_args + val oid = Binding.name_of binding val oid' = if meta_args = ODL_Meta_Args_Parser.empty_meta_args then "output" else oid @@ -71,7 +74,7 @@ fun gen_enriched_document_command2 name {body} cid_transform attr_transform mark else Value_Command.Docitem_Parser.create_and_check_docitem {is_monitor = false} {is_inline = false} {define = true} - oid pos (cid_transform cid_pos) (attr_transform doc_attrs)) + binding (cid_transform cid_pos) (attr_transform doc_attrs)) (* ... 
generating the level-attribute syntax *) in handle_margs_opt #> (fn thy => (app (check_n_tex_text thy) toks_list; thy)) end; @@ -136,10 +139,10 @@ val _ = >> (Toplevel.theory o update_instance_command)); val _ = - let fun create_and_check_docitem ((((oid, pos),cid_pos),doc_attrs),src) thy = + let fun create_and_check_docitem (((binding,cid_pos),doc_attrs),src) thy = (Value_Command.Docitem_Parser.create_and_check_docitem {is_monitor = false} {is_inline=true} - {define = false} oid pos (cid_pos) (doc_attrs) thy) + {define = false} binding (cid_pos) (doc_attrs) thy) handle ERROR msg => (if error_match src msg then (writeln ("Correct error: "^msg^": reported.");thy) else error"Wrong error reported") @@ -151,20 +154,55 @@ val _ = val _ = - let fun pass_trans_to_value_cmd (args, (((name, modes), t),src)) trans = - (Value_Command.value_cmd {assert=false} args name modes t @{here} trans - handle ERROR msg => (if error_match src msg - then (writeln ("Correct error: "^msg^": reported.");trans) - else error"Wrong error reported")) + let fun pass_trans_to_value_cmd (args, (((name, modes), t),src)) trans = + let val pos = Toplevel.pos_of trans + in trans |> Toplevel.theory + (fn thy => Value_Command.value_cmd {assert=false} args name modes t pos thy + handle ERROR msg => (if error_match src msg + then (writeln ("Correct error: "^msg^": reported."); thy) + else error"Wrong error reported")) + end in Outer_Syntax.command \<^command_keyword>\value-assert-error\ "evaluate and print term" (ODL_Meta_Args_Parser.opt_attributes -- (Value_Command.opt_evaluator -- Value_Command.opt_modes -- Parse.term -- Parse.document_source) - >> (Toplevel.theory o pass_trans_to_value_cmd)) + >> (pass_trans_to_value_cmd)) end; +val _ = + let fun definition_cmd' meta_args_opt decl params prems spec src bool ctxt = + Local_Theory.background_theory (Value_Command.meta_args_exec meta_args_opt) ctxt + |> (fn ctxt => Definition_Star_Command.definition_cmd decl params prems spec bool ctxt + handle ERROR msg 
=> if error_match src msg + then (writeln ("Correct error: "^msg^": reported.") + ; pair "Bound 0" @{thm refl} + |> pair (Bound 0) + |> rpair ctxt) + else error"Wrong error reported") + in + Outer_Syntax.local_theory' \<^command_keyword>\definition-assert-error\ "constant definition" + (ODL_Meta_Args_Parser.opt_attributes -- + (Scan.option Parse_Spec.constdecl -- (Parse_Spec.opt_thm_name ":" -- Parse.prop) -- + Parse_Spec.if_assumes -- Parse.for_fixes -- Parse.document_source) + >> (fn (meta_args_opt, ((((decl, spec), prems), params), src)) => + #2 oo definition_cmd' meta_args_opt decl params prems spec src)) + end; + + +val _ = + let fun add_doc_class_cmd' ((((overloaded, hdr), (parent, attrs)),((rejects,accept_rex),invars)), src) = + (fn thy => OntoParser.add_doc_class_cmd {overloaded = overloaded} hdr parent attrs rejects accept_rex invars thy + handle ERROR msg => (if error_match src msg + then (writeln ("Correct error: "^msg^": reported."); thy) + else error"Wrong error reported")) + in + Outer_Syntax.command \<^command_keyword>\doc_class-assert-error\ + "define document class" + ((OntoParser.parse_doc_class -- Parse.document_source) + >> (Toplevel.theory o add_doc_class_cmd')) + end val _ = Outer_Syntax.command ("text-latex", \<^here>) "formal comment (primary style)" diff --git a/Isabelle_DOF-Unit-Tests/Test_Polymorphic_Classes.thy b/Isabelle_DOF-Unit-Tests/Test_Polymorphic_Classes.thy new file mode 100644 index 0000000..f6f255a --- /dev/null +++ b/Isabelle_DOF-Unit-Tests/Test_Polymorphic_Classes.thy @@ -0,0 +1,691 @@ +theory Test_Polymorphic_Classes + imports Isabelle_DOF.Isa_DOF + TestKit +begin + +doc_class title = + short_title :: "string option" <= "None" + +doc_class Author = + email :: "string" <= "''''" + +datatype classification = SIL0 | SIL1 | SIL2 | SIL3 | SIL4 + +doc_class abstract = + keywordlist :: "string list" <= "[]" + safety_level :: "classification" <= "SIL3" + +doc_class text_section = + authored_by :: "Author set" <= "{}" + level :: "int 
option" <= "None" + +doc_class ('a::one, 'b, 'c) test0 = text_section + + testa :: "'a list" + testb :: "'b list" + testc :: "'c list" + +typ\('a, 'b, 'c) test0\ +typ\('a, 'b, 'c, 'd) test0_scheme\ + +find_consts name:"test0" +find_theorems name:"test0" + + +doc_class 'a test1 = text_section + + test1 :: "'a list" + invariant author_finite_test :: "finite (authored_by \)" + invariant force_level_test :: "(level \) \ None \ the (level \) > 1" + +find_consts name:"test1*inv" +find_theorems name:"test1*inv" + +text*[church::Author, email="\b\"]\\ +text\@{Author "church"}\ +value*\@{Author \church\}\ + +text\\<^value_>\@{Author \church\}\\ + +doc_class ('a, 'b) test2 = "'a test1" + + test2 :: "'b list" +type_synonym ('a, 'b) test2_syn = "('a, 'b) test2" + +find_theorems name:"test2" + +declare [[invariants_checking_with_tactics]] +text*[testtest::"('a, int) test2", level = "Some 2", authored_by = "{@{Author \church\}}", test2 = "[1]"]\\ +value*\test2 @{test2 \testtest\}\ + +text*[testtest2''::"(nat, int) test2", test1 = "[2::nat, 3]", test2 = "[4::int, 5]", level = "Some (2::int)"]\\ +value*\test1 @{test2 \testtest2''\}\ +declare [[invariants_checking_with_tactics = false]] + +ML\ +val t = Syntax.parse_term \<^context> "@{test2 \testtest\}" + +\ +ML\ +val t = \<^term>\test2.make 8142730 Test_Parametric_Classes_2_test2_authored_by_Attribute_Not_Initialized Test_Parametric_Classes_2_test2_level_Attribute_Not_Initialized Test_Parametric_Classes_2_test2_test1_Attribute_Not_Initialized + Test_Parametric_Classes_2_test2_test2_Attribute_Not_Initialized + \authored_by := bot, level := None\ \ +\ + +text\test2 = "[1::'a::one]" should be test2 = "[1::int]" because the type of testtest4 is ('a::one, int) test2:\ +text-assert-error[testtest4::"('a::one, int) test2", level = "Some 2", authored_by = "{@{Author \church\}}", test2 = "[1::'a::one]"]\\ + \Type unification failed\ +text\Indeed this definition fails:\ +definition-assert-error testtest2::"('a::one, int) test2" where 
"testtest2 \ + test2.make 11953346 + {@{Author \church\}} + (Some 2) + [] + [] + \authored_by := bot + , level := None, level := Some 2 + , authored_by := insert \Author.tag_attribute = 11953164, email = []\ bot + , test2.test2 := [1::('a::one)]\ " +\Type unification failed\ + +text\For now, no more support of type synonyms as parent:\ +doc_class ('a, 'b, 'c) A = +a :: "'a list" +b :: "'b list" +c :: "'c list" +type_synonym ('a, 'b, 'c) A_syn = "('a, 'b, 'c) A" + +doc_class-assert-error ('a, 'b, 'c, 'd) B = "('b, 'c, 'd) A_syn" + +d ::"'a::one list" <= "[1]" +\Undefined onto class: "A_syn"\ + + +declare[[invariants_checking_with_tactics]] + +definition* testauthor0 where "testauthor0 \ \Author.tag_attribute = 5, email = \test_author_email\\" +definition* testauthor :: "Author" where "testauthor \ \Author.tag_attribute = 5, email = \test_author_email\\" +definition* testauthor2 :: "Author" where "testauthor2 \ \Author.tag_attribute = 5, email = \test_author_email\\ \email := \test_author_email_2\ \" +definition* testauthor3 :: "Author" where "testauthor3 \ testauthor \email := \test_author_email_2\ \" + +ML\ +val ctxt = \<^context> +val input0 = Syntax.read_input "@{Author \church\}" +val source = Syntax.read_input "\<^term_>\@{Author \church\}\" +val input = source +val tt = Document_Output.output_document ctxt {markdown = false} input +\ + +doc_class ('a, 'b) elaborate1 = +a :: "'a list" +b :: "'b list" + +doc_class ('a, 'b) elaborate2 = +c :: "('a, 'b) elaborate1 list" + +doc_class ('a, 'b) elaborate3 = +d :: "('a, 'b) elaborate2 list" + +text*[test_elaborate1::"('a::one, 'b) elaborate1", a = "[1]"]\\ + +term*\@{elaborate1 \test_elaborate1\}\ +value* [nbe]\@{elaborate1 \test_elaborate1\}\ + + +text*[test_elaborate2::"('a::one, 'b) elaborate2", c = "[@{elaborate1 \test_elaborate1\}]"]\\ + +text*[test_elaborate3::"('a::one, 'b) elaborate3", d = "[@{elaborate2 \test_elaborate2\}]"]\\ + +term*\(concat o concat) ((map o map) a (map c (elaborate3.d @{elaborate3 
\test_elaborate3\})))\ +value*\(concat o concat) ((map o map) a (map c (elaborate3.d @{elaborate3 \test_elaborate3\})))\ + + +text\ +The term antiquotation is considered a ground term. +Then its type here is \<^typ>\'a::one list\ with \'a\ a fixed-type variable. +So the following definition only works because the parameter of the class is also \'a\.\ +declare[[ML_print_depth = 10000]] +doc_class 'a elaborate4 = +d :: "'a::one list" <= "(concat o concat) ((map o map) a (map c (elaborate3.d @{elaborate3 \test_elaborate3\})))" +declare[[ML_print_depth = 20]] + +declare[[ML_print_depth = 10000]] +text*[test_elaborate4::"'a::one elaborate4"]\\ +declare[[ML_print_depth = 20]] + + +text\Bug: +As the term antiquotation is considered a ground term, +its type \<^typ>\'a::one list\ conflicts with the type of the attribute \<^typ>\int list\. +To support the instantiation of the term antiquotation as an \<^typ>\int list\, +the term antiquotation should have the same behavior as a constant definition, +which is not the case for now.\ +declare[[ML_print_depth = 10000]] +doc_class-assert-error elaborate4' = +d :: "int list" <= "(concat o concat) ((map o map) a (map c (elaborate3.d @{elaborate3 \test_elaborate3\})))" +\Type unification failed\ +declare[[ML_print_depth = 20]] + +text\The behavior we want to support: \ + +definition one_list :: "'a::one list" where "one_list \ [1]" + +text\The constant \<^const>\one_list\ can be instantiated as an \<^typ>\int list\:\ +doc_class elaborate4'' = +d :: "int list" <= "one_list" + +declare[[ML_print_depth = 10000]] +text*[test_elaborate4''::"elaborate4''"]\\ +declare[[ML_print_depth = 20]] + + +term*\concat (map a (elaborate2.c @{elaborate2 \test_elaborate2\}))\ +value*\concat (map a (elaborate2.c @{elaborate2 \test_elaborate2\}))\ + +text\ +The term antiquotation is considered a ground term. +Then its type here is \<^typ>\'a::one list\ with \'a\ a fixed-type variable.
+So the following definition only works because the parameter of the class is also \'a\.\ +declare[[ML_print_depth = 10000]] +doc_class 'a elaborate5 = +d :: "'a::one list" <= "concat (map a (elaborate2.c @{elaborate2 \test_elaborate2\}))" +declare[[ML_print_depth = 20]] + +text\Bug: But when defining an instance, if we use a \'b\ variable to specify the type +of the instance (\<^typ>\'b::one elaborate5\), the unification fails:\ +declare[[ML_print_depth = 10000]] +text-assert-error[test_elaborate5::"'b::one elaborate5"]\\ +\Inconsistent sort constraints for type variable "'b"\ +declare[[ML_print_depth = 20]] + +text\Bug: +The term antiquotation is considered a ground term. +Then its type here is \<^typ>\'a::one list\ with \'a\ a fixed-type variable. +So it is not compatible with the type of the attribute \<^typ>\'a::numeral list\.\ +doc_class-assert-error 'a elaborate5' = +d :: "'a::numeral list" <= "concat (map a (elaborate2.c @{elaborate2 \test_elaborate2\}))" +\Sort constraint\ + +text\The behavior we want to support: \ + +text\The constant \<^const>\one_list\ can be instantiated as an \<^typ>\'a::numeral list\:\ +doc_class 'a elaborate5'' = +d :: "'a::numeral list" <= "one_list" + + +text*[test_elaborate1a::"('a::one, int) elaborate1", a = "[1]", b = "[2]"]\\ + +term*\@{elaborate1 \test_elaborate1a\}\ +value* [nbe]\@{elaborate1 \test_elaborate1a\}\ + +text*[test_elaborate2a::"('a::one, int) elaborate2", c = "[@{elaborate1 \test_elaborate1a\}]"]\\ + +text*[test_elaborate3a::"('a::one, int) elaborate3", d = "[@{elaborate2 \test_elaborate2a\}]"]\\ + +text\ +The term antiquotation is considered a ground term. +Then its type here is \<^typ>\'a::one list\ with \'a\ a fixed-type variable.
+So the following definition only works because the parameter of the class is also \'a\.\ +definition* test_elaborate3_embedding ::"'a::one list" + where "test_elaborate3_embedding \ (concat o concat) ((map o map) elaborate1.a (map elaborate2.c (elaborate3.d @{elaborate3 \test_elaborate3a\})))" + +text\Bug: +The term antiquotation is considered a ground term. +Then its type here is \<^typ>\'a::one list\ with \'a\ a fixed-type variable. +So it is not compatible with the specified type of the definition \<^typ>\int list\:\ +definition-assert-error test_elaborate3_embedding'::"int list" + where "test_elaborate3_embedding' \ (concat o concat) ((map o map) elaborate1.a (map elaborate2.c (elaborate3.d @{elaborate3 \test_elaborate3a\})))" +\Type unification failed\ + +term*\@{elaborate1 \test_elaborate1a\}\ +value* [nbe]\@{elaborate1 \test_elaborate1a\}\ + + +record ('a, 'b) elaborate1' = +a :: "'a list" +b :: "'b list" + +record ('a, 'b) elaborate2' = +c :: "('a, 'b) elaborate1' list" + +record ('a, 'b) elaborate3' = +d :: "('a, 'b) elaborate2' list" + +doc_class 'a one = +a::"'a list" + +text*[test_one::"'a::one one", a = "[1]"]\\ + +value* [nbe] \@{one \test_one\}\ + +term*\a @{one \test_one\}\ + +text\Bug: +The term antiquotation is considered a ground term. +Then its type here is \<^typ>\'a::one list\ with \'a\ a fixed-type variable. 
+So it is not compatible with the specified type of the definition \<^typ>\('b::one, 'a::numeral) elaborate1'\ +because the term antiquotation cannot be instantiated as a \<^typ>\'b::one list\ +and the \'a\ is checked against the \'a::numeral\ instance type parameter:\ +definition-assert-error test_elaborate1'::"('b::one, 'a::numeral) elaborate1'" + where "test_elaborate1' \ \ elaborate1'.a = a @{one \test_one\}, b = [2] \" +\Sort constraint\ + +text\This is the same behavior as the following:\ +definition-assert-error test_elaborate10::"('b::one, 'a::numeral) elaborate1'" + where "test_elaborate10 \ \ elaborate1'.a = [1::'a::one], b = [2] \" +\Sort constraint\ + +definition-assert-error test_elaborate11::"('b::one, 'c::numeral) elaborate1'" + where "test_elaborate11 \ \ elaborate1'.a = [1::'a::one], b = [2] \" +\Type unification failed\ + +text\So this works:\ +definition* test_elaborate1''::"('a::one, 'b::numeral) elaborate1'" + where "test_elaborate1'' \ \ elaborate1'.a = a @{one \test_one\}, b = [2] \" + +term \elaborate1'.a test_elaborate1''\ +value [nbe] \elaborate1'.a test_elaborate1''\ + +text\But if we embed the term antiquotation in a definition, +the type unification works:\ +definition* onedef where "onedef \ @{one \test_one\}" + +definition test_elaborate1'''::"('b::one, 'a::numeral) elaborate1'" + where "test_elaborate1''' \ \ elaborate1'.a = a onedef, b = [2] \" + +value [nbe] \elaborate1'.a test_elaborate1'''\ + + +definition test_elaborate2'::"(int, 'b::numeral) elaborate2'" + where "test_elaborate2' \ \ c = [test_elaborate1''] \" + +definition test_elaborate3'::"(int, 'b::numeral) elaborate3'" + where "test_elaborate3' \ \ d = [test_elaborate2'] \" + + +doc_class 'a test3' = +test3 :: "int" +test3' :: "'a list" + +text*[testtest30::"'a::one test3'", test3'="[1]"]\\ +text-assert-error[testtest30::"'a test3'", test3'="[1]"]\\ +\Type unification failed: Variable\ + +find_consts name:"test3'.test3" +definition testone :: "'a::one test3'" where 
"testone \ \tag_attribute = 5, test3 = 3, test3' = [1] \" +definition* testtwo :: "'a::one test3'" where "testtwo \ \tag_attribute = 5, test3 = 1, test3' = [1] \\ test3 := 1\" + +text*[testtest3'::"'a test3'", test3 = "1"]\\ + +declare [[show_sorts = false]] +definition* testtest30 :: "'a test3'" where "testtest30 \ \tag_attribute = 12, test3 = 2, test3' = [] \" +update_instance*[testtest3'::"'a test3'", test3 := "2"] + +ML\ +val t = @{value_ [nbe] \test3 @{test3' \testtest3'\}\} +val tt = HOLogic.dest_number t +\ + +text\@{value_ [] [nbe] \test3 @{test3' \testtest3'\}\}\ + +update_instance*[testtest3'::"'a test3'", test3 += "2"] + +ML\ +val t = @{value_ [nbe] \test3 @{test3' \testtest3'\}\} +val tt = HOLogic.dest_number t +\ + +value\test3 \ tag_attribute = 1, test3 = 2, test3' = [2::int, 3] \\ +value\test3 \ tag_attribute = 1, test3 = 2, test3' = [2::int, 3] \\ +find_consts name:"test3'.test3" + +ML\ +val test_value = @{value_ \@{test3' \testtest3'\}\} + +\ +declare [[show_sorts = false]] +update_instance*[testtest3'::"'a test3'", test3 += "3"] +declare [[show_sorts = false]] +value*\test3 @{test3' \testtest3'\}\ +value\test3 \ tag_attribute = 12, test3 = 5, test3' = AAAAAA\\ + +find_consts name:"test3'.test3" + +text*[testtest3''::"int test3'", test3 = "1"]\\ + +update_instance*[testtest3''::"int test3'", test3' += "[3]"] + +value*\test3' @{test3' \testtest3''\}\ + +update_instance*[testtest3''::"int test3'", test3' := "[3]"] + +value*\test3' @{test3' \testtest3''\}\ + +update_instance*[testtest3''::"int test3'", test3' += "[2,5]"] + +value*\test3' @{test3' \testtest3''\}\ + +definition testeq where "testeq \ \x. 
x" +find_consts name:"test3'.ma" + +text-assert-error[testtest3''::"int test3'", test3 = "1", test3' = "[3::'a::numeral]"]\\ + \Type unification failed\ + +text-assert-error[testtest3''::"int test3'", test3 = "1", test3' = "[3]"]\\ + \Duplicate instance declaration\ + + +declare[[ML_print_depth = 10000]] +definition-assert-error testest3''' :: "int test3'" + where "testest3''' \ \ tag_attribute = 12, test3 = 1, test3' = [2]\\ test3' := [3::'a::numeral]\" +\Type unification failed\ +declare[[ML_print_depth = 20]] + +value* \test3 @{test3' \testtest3''\}\ +value* \\ tag_attribute = 12, test3 = 1, test3' = [2]\\ test3' := [3::int]\\ +value* \test3 (\ tag_attribute = 12, test3 = 1, test3' = [2]\\ test3' := [3::int]\)\ +term*\@{test3' \testtest3''\}\ + +ML\val t = \<^term_>\test3 @{test3' \testtest3''\}\\ + +value\test3 \ tag_attribute = 12, test3 = 2, test3' = [2::int ,3]\\ + +find_consts name:"test3'.test3" +find_consts name:"Isabelle_DOF_doc_class_test3'" +update_instance*[testtest3''::"int test3'", test3 := "2"] +ML\ +val t = @{value_ [nbe] \test3 @{test3' \testtest3''\}\} +val tt = HOLogic.dest_number t |> snd +\ + +doc_class 'a testa = +a:: "'a set" +b:: "int set" + +text*[testtesta::"'a testa", b = "{2}"]\\ +update_instance*[testtesta::"'a testa", b += "{3}"] + +ML\ +val t = @{value_ [nbe] \b @{testa \testtesta\}\} +val tt = HOLogic.dest_set t |> map (HOLogic.dest_number #> snd) +\ + +update_instance-assert-error[testtesta::"'a::numeral testa", a := "{2::'a::numeral}"] +\incompatible classes:'a Test_Polymorphic_Classes.testa:'a::numeral Test_Polymorphic_Classes.testa\ + +text*[testtesta'::"'a::numeral testa", a = "{2}"]\\ + +update_instance*[testtesta'::"'a::numeral testa", a += "{3}"] + +ML\ +val t = @{value_ [nbe] \a @{testa \testtesta'\}\} +\ + +update_instance-assert-error[testtesta'::"'a::numeral testa", a += "{3::int}"] + \Type unification failed\ + +definition-assert-error testtesta'' :: "'a::numeral testa" + where "testtesta'' \ \tag_attribute = 5, a = 
{1}, b = {1} \\ a := {1::int}\" +\Type unification failed\ + +update_instance*[testtesta'::"'a::numeral testa", b := "{3::int}"] +ML\ +val t = @{value_ [nbe] \b @{testa \testtesta'\}\} +\ + +value* [nbe] \b @{testa \testtesta'\}\ + +definition testtesta'' :: "'a::numeral testa" + where "testtesta'' \ \tag_attribute = 5, a = {1}, b = {1} \\ b := {2::int}\" + +value [nbe]\b testtesta''\ + +doc_class 'a test3 = +test3 :: "'a list" +type_synonym 'a test3_syn = "'a test3" + +text*[testtest3::"int test3", test3 = "[1]"]\\ +update_instance*[testtest3::"int test3", test3 := "[2]"] +ML\ +val t = \<^term_>\test3 @{test3 \testtest3\}\ +val tt = \<^value_>\test3 @{test3 \testtest3\}\ |> HOLogic.dest_list |> map HOLogic.dest_number +\ + +update_instance*[testtest3::"int test3", test3 += "[3]"] +value*\test3 @{test3 \testtest3\}\ + + +doc_class ('a, 'b) test4 = "'a test3" + + test4 :: "'b list" + +definition-assert-error testtest0'::"('a::one, int) test4" where "testtest0' \ + test4.make 11953346 + [] [1::('a::one)]" +\Type unification failed\ + +definition-assert-error testtest0''::"('a, int) test4" where "testtest0'' \ + test4.make 11953346 + [1] Test_Parametric_Classes_2_test4_test4_Attribute_Not_Initialized" +\Type unification failed\ + +text\This must fail because the equivalent definition +\testtest0'\ above fails +due to the constraint in the where-clause: \[1::('a::one)]\ is not an \<^typ>\int list\ +but an \<^typ>\'a::one list\.\ +text-assert-error[testtest0::"('a::one, int) test4", test4 = "[1::'a::one]"]\\ +\Type unification failed\ +update_instance-assert-error[testtest0::"('a::one, int) test4"] + \Undefined instance: "testtest0"\ + +value-assert-error\@{test4 \testtest0\}\\Undefined instance: "testtest0"\ + +definition testtest0''::"('a, int) test4" where "testtest0'' \ + \ tag_attribute = 11953346, test3 = [], test4 = [1]\\test4 := [2]\" + +definition testtest0'''::"('a, int) test4" where "testtest0''' \ + \ tag_attribute = 11953346, test3 = [], test4 = [1]\\test4 := [2]\" +
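The failing `test4.make` variants above all come down to one fact about HOL records: a field declared as \<^typ>\int list\ cannot be filled with a term of the fixed type \<^typ>\'a::one list\. A minimal sketch in plain HOL, outside Isabelle/DOF (the record `r0` and its field `xs` are hypothetical and serve only to isolate that unification failure):

```isabelle
(* Minimal sketch in plain HOL, no Isabelle/DOF involved.  The record r0
   and its field xs are hypothetical.  A field declared as "int list"
   accepts an "int list" value, also through a record update: *)
record r0 =
  xs :: "int list"

definition r0_ok :: r0
  where "r0_ok \<equiv> \<lparr> xs = [1 :: int] \<rparr>\<lparr> xs := [2] \<rparr>"

(* ... but a term of the fixed type "'a::one list" is rejected with
   "Type unification failed", analogously to testtest0' above:

definition r0_bad :: r0
  where "r0_bad \<equiv> \<lparr> xs = [1 :: 'a::one] \<rparr>"
*)
```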
+ +value [nbe] \test3 testtest0''\ + +type_synonym notion = string + +doc_class Introduction = text_section + + authored_by :: "Author set" <= "UNIV" + uses :: "notion set" + invariant author_finite :: "finite (authored_by \)" + and force_level :: "(level \) \ None \ the (level \) > 1" + +doc_class claim = Introduction + + based_on :: "notion list" + +doc_class technical = text_section + + formal_results :: "thm list" + +doc_class "definition" = technical + + is_formal :: "bool" + property :: "term list" <= "[]" + +datatype kind = expert_opinion | argument | "proof" + +doc_class result = technical + + evidence :: kind + property :: "thm list" <= "[]" + invariant has_property :: "evidence \ = proof \ property \ \ []" + +doc_class example = technical + + referring_to :: "(notion + definition) set" <= "{}" + +doc_class conclusion = text_section + + establish :: "(claim \ result) set" + invariant establish_defined :: "\ x. x \ Domain (establish \) + \ (\ y \ Range (establish \). (x, y) \ establish \)" + +text\Next we define some instances (docitems): \ + +declare[[invariants_checking_with_tactics = true]] + +text*[church1::Author, email="\church@lambda.org\"]\\ + +text*[resultProof::result, evidence = "proof", property="[@{thm \HOL.refl\}]"]\\ +text*[resultArgument::result, evidence = "argument"]\\ + +text\The invariants \<^theory_text>\author_finite\ and \<^theory_text>\establish_defined\ cannot be checked directly + and need a little help. + We can set the \invariants_checking_with_tactics\ theory attribute to help the checking.
+ It will enable a basic tactic, using unfold and auto:\ + +declare[[invariants_checking_with_tactics = true]] + +text*[curry::Author, email="\curry@lambda.org\"]\\ +text*[introduction2::Introduction, authored_by = "{@{Author \church\}}", level = "Some 2"]\\ +(* When uncommented, this should violate the invariant: +update_instance*[introduction2::Introduction + , authored_by := "{@{Author \church\}}" + , level := "Some 1"] +*) + +text*[introduction_test_parsed_elaborate::Introduction, authored_by = "authored_by @{Introduction \introduction2\}", level = "Some 2"]\\ +term*\authored_by @{Introduction \introduction_test_parsed_elaborate\}\ +value*\authored_by @{Introduction \introduction_test_parsed_elaborate\}\ +text*[introduction3::Introduction, authored_by = "{@{Author \church\}}", level = "Some 2"]\\ +text*[introduction4::Introduction, authored_by = "{@{Author \curry\}}", level = "Some 4"]\\ + +text*[resultProof2::result, evidence = "proof", property="[@{thm \HOL.sym\}]"]\\ + +text\Then we can evaluate expressions with instances:\ + +term*\authored_by @{Introduction \introduction2\} = authored_by @{Introduction \introduction3\}\ +value*\authored_by @{Introduction \introduction2\} = authored_by @{Introduction \introduction3\}\ +value*\authored_by @{Introduction \introduction2\} = authored_by @{Introduction \introduction4\}\ + +value*\@{Introduction \introduction2\}\ + +value*\{@{Author \curry\}} = {@{Author \church\}}\ + +term*\property @{result \resultProof\} = property @{result \resultProof2\}\ +value*\property @{result \resultProof\} = property @{result \resultProof2\}\ + +value*\evidence @{result \resultProof\} = evidence @{result \resultProof2\}\ + +declare[[invariants_checking_with_tactics = false]] + +declare[[invariants_strict_checking = false]] + +doc_class test_A = + level :: "int option" + x :: int + +doc_class test_B = + level :: "int option" + x :: "string" (* attributes live in their own name-space *) + y :: "string list" <= "[]" (* and can have 
arbitrary type constructors *) + (* LaTeX may have problems with this, though *) + +text\We may even use type-synonyms for class synonyms ...\ +type_synonym test_XX = test_B + +doc_class test_C0 = test_B + + z :: "test_A option" <= None (* A LINK, i.e. an attribute that has a type + referring to a document class. Mathematical + relations over document items can be modeled. *) + g :: "thm" (* a reference to the proxy-type 'thm' allowing + + to denote references to theorems inside attributes *) + + +doc_class test_C = test_B + + z :: "test_A option" <= None (* A LINK, i.e. an attribute that has a type + referring to a document class. Mathematical + relations over document items can be modeled. *) + g :: "thm" (* a reference to the proxy-type 'thm' allowing + + to denote references to theorems inside attributes *) + +datatype enum = X1 | X2 | X3 (* we add an enumeration type ... *) + + +doc_class test_D = test_B + + x :: "string" <= "\def \\" (* overriding default *) + a1 :: enum <= "X2" (* class - definitions may be mixed + with arbitrary HOL-commands, thus + also local definitions of enumerations *) + a2 :: int <= 0 + +doc_class test_E = test_D + + x :: "string" <= "''qed''" (* overriding default *) + +doc_class test_G = test_C + + g :: "thm" <= "@{thm \HOL.refl\}" (* warning overriding attribute expected*) + +doc_class 'a test_F = + properties :: "term list" + r :: "thm list" + u :: "file" + s :: "typ list" + b :: "(test_A \ 'a test_C_scheme) set" <= "{}" (* This is a relation link, roughly corresponding + to an association class. It can be used to track + claims to result - relations, for example.*) + b' :: "(test_A \ 'a test_C_scheme) list" <= "[]" + invariant br :: "r \ \ [] \ card(b \) \ 3" + and br':: "r \ \ [] \ length(b' \) \ 3" + and cr :: "properties \ \ []" + +lemma*[l::test_E] local_sample_lemma : + "@{thm \refl\} = @{thm ''refl''}" by simp + \ \un-evaluated references are similar to + uninterpreted constants. 
Not much is known + about them, but that doesn't mean that we + can't prove some basics over them...\ + +text*[xcv1::test_A, x=5]\Lorem ipsum ...\ +text*[xcv2::test_C, g="@{thm ''HOL.refl''}"]\Lorem ipsum ...\ +text*[xcv3::test_A, x=7]\Lorem ipsum ...\ + +text\Bug: For now, the implementation is no longer compatible with the docitem term-antiquotation:\ +text-assert-error[xcv10::"unit test_F", r="[@{thm ''HOL.refl''}, + @{thm \local_sample_lemma\}]", (* long names required *) + b="{(@{docitem ''xcv1''},@{docitem \xcv2\})}", (* notations \...\ vs. ''...'' *) + s="[@{typ \int list\}]", + properties = "[@{term \H \ H\}]" (* notation \...\ required for UTF8 *) +]\Lorem ipsum ...\\Type unification failed\ + +text*[xcv11::"unit test_F", r="[@{thm ''HOL.refl''}, + @{thm \local_sample_lemma\}]", (* long names required *) + b="{(@{test_A ''xcv1''},@{test_C \xcv2\})}", (* notations \...\ vs. ''...'' *) + s="[@{typ \int list\}]", + properties = "[@{term \H \ H\}]" (* notation \...\ required for UTF8 *) +]\Lorem ipsum ...\ + +value*\b @{test_F \xcv11\}\ + +typ\unit test_F\ + +text*[xcv4::"unit test_F", r="[@{thm ''HOL.refl''}, + @{thm \local_sample_lemma\}]", (* long names required *) + b="{(@{test_A ''xcv1''},@{test_C \xcv2\})}", (* notations \...\ vs. ''...'' *) + s="[@{typ \int list\}]", + properties = "[@{term \H \ H\}]" (* notation \...\ required for UTF8 *) +]\Lorem ipsum ...\ + +value*\b @{test_F \xcv4\}\ + +text*[xcv5::test_G, g="@{thm \HOL.sym\}"]\Lorem ipsum ...\ + +update_instance*[xcv4::"unit test_F", b+="{(@{test_A ''xcv3''},@{test_C ''xcv2''})}"] + +update_instance-assert-error[xcv4::"unit test_F", b+="{(@{test_A ''xcv3''},@{test_G ''xcv5''})}"] + \Type unification failed: Clash of types\ + + + +typ\unit test_G_ext\ +typ\\test_G.tag_attribute :: int\\ +text*[xcv6::"\test_G.tag_attribute :: int\ test_F", b="{(@{test_A ''xcv3''},@{test_G ''xcv5''})}"]\\ + + +text\\lemma*\, etc. do not support term antiquotations for polymorphic classes well.
+For now, only a term-antiquotation embedded in a definition works:\ +definition* testtest_level where "testtest_level \ the (text_section.level @{test2 \testtest2''\})" +lemma*[e5::E] testtest : "xx + testtest_level = yy + testtest_level \ xx = yy" by simp + +text\Indeed this fails:\ +(*lemma*[e6::E] testtest : "xx + the (level @{test2 \testtest2''\}) = yy + the (level @{test2 \testtest2''\}) \ xx = yy" by simp*) + +end \ No newline at end of file diff --git a/Isabelle_DOF-Unit-Tests/test.ML b/Isabelle_DOF-Unit-Tests/test.ML deleted file mode 100644 index 10e350d..0000000 --- a/Isabelle_DOF-Unit-Tests/test.ML +++ /dev/null @@ -1,2454 +0,0 @@ -(* Title: HOL/Tools/record.ML - Author: Wolfgang Naraschewski, TU Muenchen - Author: Markus Wenzel, TU Muenchen - Author: Norbert Schirmer, TU Muenchen - Author: Norbert Schirmer, Apple, 2022 - Author: Thomas Sewell, NICTA - -Extensible records with structural subtyping. -*) - -signature RECORD = -sig - val type_abbr: bool Config.T - val type_as_fields: bool Config.T - val timing: bool Config.T - - type info = - {args: (string * sort) list, - parent: (typ list * string) option, - fields: (string * typ) list, - extension: (string * typ list), - ext_induct: thm, ext_inject: thm, ext_surjective: thm, ext_split: thm, ext_def: thm, - select_convs: thm list, update_convs: thm list, select_defs: thm list, update_defs: thm list, - fold_congs: thm list, unfold_congs: thm list, splits: thm list, defs: thm list, - surjective: thm, equality: thm, induct_scheme: thm, induct: thm, cases_scheme: thm, - cases: thm, simps: thm list, iffs: thm list} - val get_info: theory -> string -> info option - val the_info: theory -> string -> info - val get_hierarchy: theory -> (string * typ list) -> (string * ((string * sort) * typ) list) list - val add_record: {overloaded: bool} -> (string * sort) list * binding -> - (typ list * string) option -> (binding * typ * mixfix) list -> theory -> theory - - val last_extT: typ -> (string * typ list) option -
val dest_recTs: typ -> (string * typ list) list - val get_extT_fields: theory -> typ -> (string * typ) list * (string * typ) - val get_recT_fields: theory -> typ -> (string * typ) list * (string * typ) - val get_parent: theory -> string -> (typ list * string) option - val get_extension: theory -> string -> (string * typ list) option - val get_extinjects: theory -> thm list - val get_simpset: theory -> simpset - val simproc: simproc - val eq_simproc: simproc - val upd_simproc: simproc - val split_simproc: (term -> int) -> simproc - val ex_sel_eq_simproc: simproc - val split_tac: Proof.context -> int -> tactic - val split_simp_tac: Proof.context -> thm list -> (term -> int) -> int -> tactic - val split_wrapper: string * (Proof.context -> wrapper) - - val pretty_recT: Proof.context -> typ -> Pretty.T - val string_of_record: Proof.context -> string -> string - - val codegen: bool Config.T - val sort_updates: bool Config.T - val updateN: string - val ext_typeN: string - val extN: string -end; - - -signature ISO_TUPLE_SUPPORT = -sig - val add_iso_tuple_type: {overloaded: bool} -> binding * (string * sort) list -> - typ * typ -> theory -> (term * term) * theory - val mk_cons_tuple: term * term -> term - val dest_cons_tuple: term -> term * term - val iso_tuple_intros_tac: Proof.context -> int -> tactic -end; - -structure Iso_Tuple_Support: ISO_TUPLE_SUPPORT = -struct - -val isoN = "_Tuple_Iso"; - -val iso_tuple_intro = @{thm isomorphic_tuple_intro}; -val iso_tuple_intros = Tactic.build_net @{thms isomorphic_tuple.intros}; - -val tuple_iso_tuple = (\<^const_name>\Test.tuple_iso_tuple\, @{thm tuple_iso_tuple}); - -structure Iso_Tuple_Thms = Theory_Data -( - type T = thm Symtab.table; - val empty = Symtab.make [tuple_iso_tuple]; - fun merge data = Symtab.merge Thm.eq_thm_prop data; (* FIXME handle Symtab.DUP ?? *) -); - -fun get_typedef_info tyco vs - (({rep_type, Abs_name, ...}, {Rep_inject, Abs_inverse, ... 
}) : Typedef.info) thy = - let - val exists_thm = - UNIV_I - |> Thm.instantiate' [SOME (Thm.global_ctyp_of thy (Logic.varifyT_global rep_type))] []; - val proj_constr = Abs_inverse OF [exists_thm]; - val absT = Type (tyco, map TFree vs); - in - thy - |> pair (tyco, ((Rep_inject, proj_constr), Const (Abs_name, rep_type --> absT), absT)) - end - -fun do_typedef overloaded raw_tyco repT raw_vs thy = - let - val ctxt = Proof_Context.init_global thy |> Variable.declare_typ repT; - val vs = map (Proof_Context.check_tfree ctxt) raw_vs; - in - thy - |> Named_Target.theory_map_result (apsnd o Typedef.transform_info) - (Typedef.add_typedef overloaded (raw_tyco, vs, NoSyn) - (HOLogic.mk_UNIV repT) NONE (fn ctxt' => resolve_tac ctxt' [UNIV_witness] 1)) - |-> (fn (tyco, info) => get_typedef_info tyco vs info) - end; - -fun mk_cons_tuple (t, u) = - let val (A, B) = apply2 fastype_of (t, u) - in \<^Const>\iso_tuple_cons \<^Type>\prod A B\ A B for \<^Const>\tuple_iso_tuple A B\ t u\ end; - -fun dest_cons_tuple \<^Const_>\iso_tuple_cons _ _ _ for \Const _\ t u\ = (t, u) - | dest_cons_tuple t = raise TERM ("dest_cons_tuple", [t]); - -fun add_iso_tuple_type overloaded (b, alphas) (leftT, rightT) thy = - let - val repT = HOLogic.mk_prodT (leftT, rightT); - - val ((_, ((rep_inject, abs_inverse), absC, absT)), typ_thy) = - thy - |> do_typedef overloaded b repT alphas - ||> Sign.add_path (Binding.name_of b); (*FIXME proper prefixing instead*) - val typ_ctxt = Proof_Context.init_global typ_thy; - - (*construct a type and body for the isomorphism constant by - instantiating the theorem to which the definition will be applied*) - val intro_inst = - rep_inject RS - infer_instantiate typ_ctxt - [(("abst", 0), Thm.cterm_of typ_ctxt absC)] iso_tuple_intro; - val (_, body) = Logic.dest_equals (List.last (Thm.prems_of intro_inst)); - val isomT = fastype_of body; - val isom_binding = Binding.suffix_name isoN b; - val isom_name = Sign.full_name typ_thy isom_binding; - val isom = Const (isom_name, 
isomT); - - val ([isom_def], cdef_thy) = - typ_thy - |> Sign.declare_const_global ((isom_binding, isomT), NoSyn) |> snd - |> Global_Theory.add_defs false - [((Binding.concealed (Thm.def_binding isom_binding), Logic.mk_equals (isom, body)), [])]; - - val iso_tuple = isom_def RS (abs_inverse RS (rep_inject RS iso_tuple_intro)); - val cons = \<^Const>\iso_tuple_cons absT leftT rightT\; - - val thm_thy = - cdef_thy - |> Iso_Tuple_Thms.map (Symtab.insert Thm.eq_thm_prop (isom_name, iso_tuple)) - |> Sign.restore_naming thy - in - ((isom, cons $ isom), thm_thy) - end; - -fun iso_tuple_intros_tac ctxt = - resolve_from_net_tac ctxt iso_tuple_intros THEN' - CSUBGOAL (fn (cgoal, i) => - let - val goal = Thm.term_of cgoal; - - val isthms = Iso_Tuple_Thms.get (Proof_Context.theory_of ctxt); - fun err s t = raise TERM ("iso_tuple_intros_tac: " ^ s, [t]); - - val goal' = Envir.beta_eta_contract goal; - val is = - (case goal' of - \<^Const_>\Trueprop for \<^Const>\isomorphic_tuple _ _ _ for \Const is\\\ => is - | _ => err "unexpected goal format" goal'); - val isthm = - (case Symtab.lookup isthms (#1 is) of - SOME isthm => isthm - | NONE => err "no thm found for constant" (Const is)); - in resolve_tac ctxt [isthm] i end); - -end; - - -structure Record: RECORD = -struct - -val surject_assistI = @{thm iso_tuple_surjective_proof_assistI}; -val surject_assist_idE = @{thm iso_tuple_surjective_proof_assist_idE}; - -val updacc_accessor_eqE = @{thm update_accessor_accessor_eqE}; -val updacc_updator_eqE = @{thm update_accessor_updator_eqE}; -val updacc_eq_idI = @{thm iso_tuple_update_accessor_eq_assist_idI}; -val updacc_eq_triv = @{thm iso_tuple_update_accessor_eq_assist_triv}; - -val updacc_foldE = @{thm update_accessor_congruence_foldE}; -val updacc_unfoldE = @{thm update_accessor_congruence_unfoldE}; -val updacc_noopE = @{thm update_accessor_noopE}; -val updacc_noop_compE = @{thm update_accessor_noop_compE}; -val updacc_cong_idI = @{thm update_accessor_cong_assist_idI}; -val 
updacc_cong_triv = @{thm update_accessor_cong_assist_triv}; -val updacc_cong_from_eq = @{thm iso_tuple_update_accessor_cong_from_eq}; - -val codegen = Attrib.setup_config_bool \<^binding>\record_codegen\ (K true); -val sort_updates = Attrib.setup_config_bool \<^binding>\record_sort_updates\ (K false); - - -(** name components **) - -val rN = "r"; -val wN = "w"; -val moreN = "more"; -val schemeN = "_scheme"; -val ext_typeN = "_ext"; -val inner_typeN = "_inner"; -val extN ="_ext"; -val updateN = "_update"; -val makeN = "make"; -val fields_selN = "fields"; -val extendN = "extend"; -val truncateN = "truncate"; - - - -(*** utilities ***) - -fun varifyT idx = map_type_tfree (fn (a, S) => TVar ((a, idx), S)); - - -(* timing *) - -val timing = Attrib.setup_config_bool \<^binding>\record_timing\ (K false); -fun timeit_msg ctxt s x = if Config.get ctxt timing then (warning s; timeit x) else x (); -fun timing_msg ctxt s = if Config.get ctxt timing then warning s else (); - - -(* syntax *) - -val Trueprop = HOLogic.mk_Trueprop; - -infix 0 :== ===; -infixr 0 ==>; - -val op :== = Misc_Legacy.mk_defpair; -val op === = Trueprop o HOLogic.mk_eq; -val op ==> = Logic.mk_implies; - - -(* constructor *) - -fun mk_ext (name, T) ts = - let val Ts = map fastype_of ts - in list_comb (Const (suffix extN name, Ts ---> T), ts) end; - - -(* selector *) - -fun mk_selC sT (c, T) = (c, sT --> T); - -fun mk_sel s (c, T) = - let val sT = fastype_of s - in Const (mk_selC sT (c, T)) $ s end; - - -(* updates *) - -fun mk_updC sfx sT (c, T) = (suffix sfx c, (T --> T) --> sT --> sT); - -fun mk_upd' sfx c v sT = - let val vT = domain_type (fastype_of v); - in Const (mk_updC sfx sT (c, vT)) $ v end; - -fun mk_upd sfx c v s = mk_upd' sfx c v (fastype_of s) $ s; - - -(* types *) - -fun dest_recT (typ as Type (c_ext_type, Ts as (_ :: _))) = - (case try (unsuffix ext_typeN) c_ext_type of - NONE => raise TYPE ("Record.dest_recT", [typ], []) - | SOME c => ((c, Ts), List.last Ts)) - | dest_recT typ = raise TYPE 
("Record.dest_recT", [typ], []); - -val is_recT = can dest_recT; - -fun dest_recTs T = - let val ((c, Ts), U) = dest_recT T - in (c, Ts) :: dest_recTs U - end handle TYPE _ => []; - -fun last_extT T = - let val ((c, Ts), U) = dest_recT T in - (case last_extT U of - NONE => SOME (c, Ts) - | SOME l => SOME l) - end handle TYPE _ => NONE; - -fun rec_id i T = - let - val rTs = dest_recTs T; - val rTs' = if i < 0 then rTs else take i rTs; - in implode (map #1 rTs') end; - - - -(*** extend theory by record definition ***) - -(** record info **) - -(* type info and parent_info *) - -type info = - {args: (string * sort) list, - parent: (typ list * string) option, - fields: (string * typ) list, - extension: (string * typ list), - - ext_induct: thm, - ext_inject: thm, - ext_surjective: thm, - ext_split: thm, - ext_def: thm, - - select_convs: thm list, - update_convs: thm list, - select_defs: thm list, - update_defs: thm list, - fold_congs: thm list, (* potentially used in L4.verified *) - unfold_congs: thm list, (* potentially used in L4.verified *) - splits: thm list, - defs: thm list, - - surjective: thm, - equality: thm, - induct_scheme: thm, - induct: thm, - cases_scheme: thm, - cases: thm, - - simps: thm list, - iffs: thm list}; - -fun make_info args parent fields extension - ext_induct ext_inject ext_surjective ext_split ext_def - select_convs update_convs select_defs update_defs fold_congs unfold_congs splits defs - surjective equality induct_scheme induct cases_scheme cases - simps iffs : info = - {args = args, parent = parent, fields = fields, extension = extension, - ext_induct = ext_induct, ext_inject = ext_inject, ext_surjective = ext_surjective, - ext_split = ext_split, ext_def = ext_def, select_convs = select_convs, - update_convs = update_convs, select_defs = select_defs, update_defs = update_defs, - fold_congs = fold_congs, unfold_congs = unfold_congs, splits = splits, defs = defs, - surjective = surjective, equality = equality, induct_scheme = induct_scheme, 
- induct = induct, cases_scheme = cases_scheme, cases = cases, simps = simps, iffs = iffs}; - -type parent_info = - {name: string, - fields: (string * typ) list, - extension: (string * typ list), - induct_scheme: thm, - ext_def: thm}; - -fun make_parent_info name fields extension ext_def induct_scheme : parent_info = - {name = name, fields = fields, extension = extension, - ext_def = ext_def, induct_scheme = induct_scheme}; - - -(* theory data *) - -type data = - {records: info Symtab.table, - sel_upd: - {selectors: (int * bool) Symtab.table, - updates: string Symtab.table, - simpset: simpset, - defset: simpset}, - equalities: thm Symtab.table, - extinjects: thm list, - extsplit: thm Symtab.table, (*maps extension name to split rule*) - splits: (thm * thm * thm * thm) Symtab.table, (*!!, ALL, EX - split-equalities, induct rule*) - extfields: (string * typ) list Symtab.table, (*maps extension to its fields*) - fieldext: (string * typ list) Symtab.table}; (*maps field to its extension*) - -fun make_data - records sel_upd equalities extinjects extsplit splits extfields fieldext = - {records = records, sel_upd = sel_upd, - equalities = equalities, extinjects=extinjects, extsplit = extsplit, splits = splits, - extfields = extfields, fieldext = fieldext }: data; - -structure Data = Theory_Data -( - type T = data; - val empty = - make_data Symtab.empty - {selectors = Symtab.empty, updates = Symtab.empty, - simpset = HOL_basic_ss, defset = HOL_basic_ss} - Symtab.empty [] Symtab.empty Symtab.empty Symtab.empty Symtab.empty; - fun merge - ({records = recs1, - sel_upd = {selectors = sels1, updates = upds1, simpset = ss1, defset = ds1}, - equalities = equalities1, - extinjects = extinjects1, - extsplit = extsplit1, - splits = splits1, - extfields = extfields1, - fieldext = fieldext1}, - {records = recs2, - sel_upd = {selectors = sels2, updates = upds2, simpset = ss2, defset = ds2}, - equalities = equalities2, - extinjects = extinjects2, - extsplit = extsplit2, - splits = 
splits2, - extfields = extfields2, - fieldext = fieldext2}) = - make_data - (Symtab.merge (K true) (recs1, recs2)) - {selectors = Symtab.merge (K true) (sels1, sels2), - updates = Symtab.merge (K true) (upds1, upds2), - simpset = Simplifier.merge_ss (ss1, ss2), - defset = Simplifier.merge_ss (ds1, ds2)} - (Symtab.merge Thm.eq_thm_prop (equalities1, equalities2)) - (Thm.merge_thms (extinjects1, extinjects2)) - (Symtab.merge Thm.eq_thm_prop (extsplit1, extsplit2)) - (Symtab.merge (fn ((a, b, c, d), (w, x, y, z)) => - Thm.eq_thm (a, w) andalso Thm.eq_thm (b, x) andalso - Thm.eq_thm (c, y) andalso Thm.eq_thm (d, z)) (splits1, splits2)) - (Symtab.merge (K true) (extfields1, extfields2)) - (Symtab.merge (K true) (fieldext1, fieldext2)); -); - - -(* access 'records' *) - -val get_info = Symtab.lookup o #records o Data.get; - -fun the_info thy name = - (case get_info thy name of - SOME info => info - | NONE => error ("Unknown record type " ^ quote name)); - -fun put_record name info = - Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => - make_data (Symtab.update (name, info) records) - sel_upd equalities extinjects extsplit splits extfields fieldext); - - -(* access 'sel_upd' *) - -val get_sel_upd = #sel_upd o Data.get; - -val is_selector = Symtab.defined o #selectors o get_sel_upd; -val get_updates = Symtab.lookup o #updates o get_sel_upd; - -val get_simpset = #simpset o get_sel_upd; -val get_sel_upd_defs = #defset o get_sel_upd; - -fun get_update_details u thy = - let val sel_upd = get_sel_upd thy in - (case Symtab.lookup (#updates sel_upd) u of - SOME s => - let val SOME (dep, ismore) = Symtab.lookup (#selectors sel_upd) s - in SOME (s, dep, ismore) end - | NONE => NONE) - end; - -fun put_sel_upd names more depth simps defs thy = - let - val ctxt0 = Proof_Context.init_global thy; - - val all = names @ [more]; - val sels = map (rpair (depth, false)) names @ [(more, (depth, true))]; - val upds = map (suffix updateN) all ~~ 
all; - - val {records, sel_upd = {selectors, updates, simpset, defset}, - equalities, extinjects, extsplit, splits, extfields, fieldext} = Data.get thy; - val data = - make_data records - {selectors = fold Symtab.update_new sels selectors, - updates = fold Symtab.update_new upds updates, - simpset = simpset_map ctxt0 (fn ctxt => ctxt addsimps simps) simpset, - defset = simpset_map ctxt0 (fn ctxt => ctxt addsimps defs) defset} - equalities extinjects extsplit splits extfields fieldext; - in Data.put data thy end; - - -(* access 'equalities' *) - -fun add_equalities name thm = - Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => - make_data records sel_upd - (Symtab.update_new (name, thm) equalities) extinjects extsplit splits extfields fieldext); - -val get_equalities = Symtab.lookup o #equalities o Data.get; - - -(* access 'extinjects' *) - -fun add_extinjects thm = - Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => - make_data records sel_upd equalities (insert Thm.eq_thm_prop thm extinjects) - extsplit splits extfields fieldext); - -val get_extinjects = rev o #extinjects o Data.get; - - -(* access 'extsplit' *) - -fun add_extsplit name thm = - Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => - make_data records sel_upd equalities extinjects - (Symtab.update_new (name, thm) extsplit) splits extfields fieldext); - - -(* access 'splits' *) - -fun add_splits name thmP = - Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => - make_data records sel_upd equalities extinjects extsplit - (Symtab.update_new (name, thmP) splits) extfields fieldext); - -val get_splits = Symtab.lookup o #splits o Data.get; - - -(* parent/extension of named record *) - -val get_parent = (Option.join o Option.map #parent) oo (Symtab.lookup o #records o Data.get); -val get_extension = Option.map #extension 
oo (Symtab.lookup o #records o Data.get); - - -(* access 'extfields' *) - -fun add_extfields name fields = - Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => - make_data records sel_upd equalities extinjects extsplit splits - (Symtab.update_new (name, fields) extfields) fieldext); - -val get_extfields = Symtab.lookup o #extfields o Data.get; - -fun get_extT_fields thy T = - let - val ((name, Ts), moreT) = dest_recT T; - val recname = - let val (nm :: _ :: rst) = rev (Long_Name.explode name) (* FIXME !? *) - in Long_Name.implode (rev (nm :: rst)) end; - val varifyT = varifyT (maxidx_of_typs (moreT :: Ts) + 1); - val {records, extfields, ...} = Data.get thy; - val (fields, (more, _)) = split_last (Symtab.lookup_list extfields name); - val args = map varifyT (snd (#extension (the (Symtab.lookup records recname)))); - - val subst = fold (Sign.typ_match thy) (#1 (split_last args) ~~ #1 (split_last Ts)) Vartab.empty; - val fields' = map (apsnd (Envir.norm_type subst o varifyT)) fields; - in (fields', (more, moreT)) end; - -fun get_recT_fields thy T = - let - val (root_fields, (root_more, root_moreT)) = get_extT_fields thy T; - val (rest_fields, rest_more) = - if is_recT root_moreT then get_recT_fields thy root_moreT - else ([], (root_more, root_moreT)); - in (root_fields @ rest_fields, rest_more) end; - - -(* access 'fieldext' *) - -fun add_fieldext extname_types fields = - Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => - let - val fieldext' = - fold (fn field => Symtab.update_new (field, extname_types)) fields fieldext; - in make_data records sel_upd equalities extinjects extsplit splits extfields fieldext' end); - -val get_fieldext = Symtab.lookup o #fieldext o Data.get; - - -(* parent records *) - -local - -fun add_parents _ NONE = I - | add_parents thy (SOME (types, name)) = - let - fun err msg = error (msg ^ " parent record " ^ quote name); - - val {args, parent, ...} 
= - (case get_info thy name of SOME info => info | NONE => err "Unknown"); - val _ = if length types <> length args then err "Bad number of arguments for" else (); - - fun bad_inst ((x, S), T) = - if Sign.of_sort thy (T, S) then NONE else SOME x - val bads = map_filter bad_inst (args ~~ types); - val _ = null bads orelse err ("Ill-sorted instantiation of " ^ commas bads ^ " in"); - - val inst = args ~~ types; - val subst = Term.map_type_tfree (the o AList.lookup (op =) inst); - val parent' = Option.map (apfst (map subst)) parent; - in cons (name, inst) #> add_parents thy parent' end; - -in - -fun get_hierarchy thy (name, types) = add_parents thy (SOME (types, name)) []; - -fun get_parent_info thy parent = - add_parents thy parent [] |> map (fn (name, inst) => - let - val subst = Term.map_type_tfree (the o AList.lookup (op =) inst); - val {fields, extension, induct_scheme, ext_def, ...} = the_info thy name; - val fields' = map (apsnd subst) fields; - val extension' = apsnd (map subst) extension; - in make_parent_info name fields' extension' ext_def induct_scheme end); - -end; - - - -(** concrete syntax for records **) - -(* parse translations *) - -local - -fun split_args (field :: fields) ((name, arg) :: fargs) = - if can (unsuffix name) field then - let val (args, rest) = split_args fields fargs - in (arg :: args, rest) end - else raise Fail ("expecting field " ^ quote field ^ " but got " ^ quote name) - | split_args [] (fargs as (_ :: _)) = ([], fargs) - | split_args (_ :: _) [] = raise Fail "expecting more fields" - | split_args _ _ = ([], []); - -fun field_type_tr ((Const (\<^syntax_const>\_field_type\, _) $ Const (name, _) $ arg)) = - (name, arg) - | field_type_tr t = raise TERM ("field_type_tr", [t]); - -fun field_types_tr (Const (\<^syntax_const>\_field_types\, _) $ t $ u) = - field_type_tr t :: field_types_tr u - | field_types_tr t = [field_type_tr t]; - -fun record_field_types_tr more ctxt t = - let - val thy = Proof_Context.theory_of ctxt; - fun err msg = 
raise TERM ("Error in record-type input: " ^ msg, [t]); - - fun mk_ext (fargs as (name, _) :: _) = - (case get_fieldext thy (Proof_Context.intern_const ctxt name) of - SOME (ext, alphas) => - (case get_extfields thy ext of - SOME fields => - let - val (fields', _) = split_last fields; - val types = map snd fields'; - val (args, rest) = split_args (map fst fields') fargs - handle Fail msg => err msg; - val argtypes = Syntax.check_typs ctxt (map Syntax_Phases.decode_typ args); - val varifyT = varifyT (fold Term.maxidx_typ argtypes ~1 + 1); - val vartypes = map varifyT types; - - val subst = Type.raw_matches (vartypes, argtypes) Vartab.empty - handle Type.TYPE_MATCH => err "type is no proper record (extension)"; - val alphas' = - map (Syntax_Phases.term_of_typ ctxt o Envir.norm_type subst o varifyT) - (#1 (split_last alphas)); - - val more' = mk_ext rest; - in - list_comb - (Syntax.const (Lexicon.mark_type (suffix ext_typeN ext)), alphas' @ [more']) - end - | NONE => err ("no fields defined for " ^ quote ext)) - | NONE => err (quote name ^ " is no proper field")) - | mk_ext [] = more; - in - mk_ext (field_types_tr t) - end; - -fun record_type_tr ctxt [t] = record_field_types_tr (Syntax.const \<^type_syntax>\unit\) ctxt t - | record_type_tr _ ts = raise TERM ("record_type_tr", ts); - -fun record_type_scheme_tr ctxt [t, more] = record_field_types_tr more ctxt t - | record_type_scheme_tr _ ts = raise TERM ("record_type_scheme_tr", ts); - - -fun field_tr ((Const (\<^syntax_const>\_field\, _) $ Const (name, _) $ arg)) = (name, arg) - | field_tr t = raise TERM ("field_tr", [t]); - -fun fields_tr (Const (\<^syntax_const>\_fields\, _) $ t $ u) = field_tr t :: fields_tr u - | fields_tr t = [field_tr t]; - -fun record_fields_tr more ctxt t = - let - val thy = Proof_Context.theory_of ctxt; - fun err msg = raise TERM ("Error in record input: " ^ msg, [t]); - - fun mk_ext (fargs as (name, _) :: _) = - (case get_fieldext thy (Proof_Context.intern_const ctxt name) of - SOME (ext, _) 
=> - (case get_extfields thy ext of - SOME fields => - let - val (args, rest) = split_args (map fst (fst (split_last fields))) fargs - handle Fail msg => err msg; - val more' = mk_ext rest; - in list_comb (Syntax.const (Lexicon.mark_const (ext ^ extN)), args @ [more']) end - | NONE => err ("no fields defined for " ^ quote ext)) - | NONE => err (quote name ^ " is no proper field")) - | mk_ext [] = more; - in mk_ext (fields_tr t) end; - -fun record_tr ctxt [t] = record_fields_tr (Syntax.const \<^const_syntax>\Unity\) ctxt t - | record_tr _ ts = raise TERM ("record_tr", ts); - -fun record_scheme_tr ctxt [t, more] = record_fields_tr more ctxt t - | record_scheme_tr _ ts = raise TERM ("record_scheme_tr", ts); - - -fun field_update_tr (Const (\<^syntax_const>\_field_update\, _) $ Const (name, _) $ arg) = - Syntax.const (suffix updateN name) $ Abs (Name.uu_, dummyT, arg) - | field_update_tr t = raise TERM ("field_update_tr", [t]); - -fun field_updates_tr (Const (\<^syntax_const>\_field_updates\, _) $ t $ u) = - field_update_tr t :: field_updates_tr u - | field_updates_tr t = [field_update_tr t]; - -fun record_update_tr [t, u] = fold (curry op $) (field_updates_tr u) t - | record_update_tr ts = raise TERM ("record_update_tr", ts); - -in - -val _ = - Theory.setup - (Sign.parse_translation - [(\<^syntax_const>\_record_update\, K record_update_tr), - (\<^syntax_const>\_record\, record_tr), - (\<^syntax_const>\_record_scheme\, record_scheme_tr), - (\<^syntax_const>\_record_type\, record_type_tr), - (\<^syntax_const>\_record_type_scheme\, record_type_scheme_tr)]); - -end; - - -(* print translations *) - -val type_abbr = Attrib.setup_config_bool \<^binding>\record_type_abbr\ (K true); -val type_as_fields = Attrib.setup_config_bool \<^binding>\record_type_as_fields\ (K true); - - -local - -(* FIXME early extern (!??) *) -(* FIXME Syntax.free (??) 
*) -fun field_type_tr' (c, t) = Syntax.const \<^syntax_const>\_field_type\ $ Syntax.const c $ t; - -fun field_types_tr' (t, u) = Syntax.const \<^syntax_const>\_field_types\ $ t $ u; - -fun record_type_tr' ctxt t = - let - val thy = Proof_Context.theory_of ctxt; - - val T = Syntax_Phases.decode_typ t; - val varifyT = varifyT (Term.maxidx_of_typ T + 1); - - fun strip_fields T = - (case T of - Type (ext, args as _ :: _) => - (case try (unsuffix ext_typeN) ext of - SOME ext' => - (case get_extfields thy ext' of - SOME (fields as (x, _) :: _) => - (case get_fieldext thy x of - SOME (_, alphas) => - (let - val (f :: fs, _) = split_last fields; - val fields' = - apfst (Proof_Context.extern_const ctxt) f :: - map (apfst Long_Name.base_name) fs; - val (args', more) = split_last args; - val alphavars = map varifyT (#1 (split_last alphas)); - val subst = Type.raw_matches (alphavars, args') Vartab.empty; - val fields'' = (map o apsnd) (Envir.norm_type subst o varifyT) fields'; - in fields'' @ strip_fields more end - handle Type.TYPE_MATCH => [("", T)]) - | _ => [("", T)]) - | _ => [("", T)]) - | _ => [("", T)]) - | _ => [("", T)]); - - val (fields, (_, moreT)) = split_last (strip_fields T); - val _ = null fields andalso raise Match; - val u = - foldr1 field_types_tr' - (map (field_type_tr' o apsnd (Syntax_Phases.term_of_typ ctxt)) fields); - in - if not (Config.get ctxt type_as_fields) orelse null fields then raise Match - else if moreT = HOLogic.unitT then Syntax.const \<^syntax_const>\_record_type\ $ u - else - Syntax.const \<^syntax_const>\_record_type_scheme\ $ u $ - Syntax_Phases.term_of_typ ctxt moreT - end; - -(*try to reconstruct the record name type abbreviation from - the (nested) extension types*) -fun record_type_abbr_tr' abbr alphas zeta last_ext schemeT ctxt tm = - let - val T = Syntax_Phases.decode_typ tm; - val varifyT = varifyT (maxidx_of_typ T + 1); - - fun mk_type_abbr subst name args = - let val abbrT = Type (name, map (varifyT o TFree) args) - in 
Syntax_Phases.term_of_typ ctxt (Envir.norm_type subst abbrT) end; - - fun match rT T = Type.raw_match (varifyT rT, T) Vartab.empty; - in - if Config.get ctxt type_abbr then - (case last_extT T of - SOME (name, _) => - if name = last_ext then - let val subst = match schemeT T in - if HOLogic.is_unitT (Envir.norm_type subst (varifyT (TFree zeta))) - then mk_type_abbr subst abbr alphas - else mk_type_abbr subst (suffix schemeN abbr) (alphas @ [zeta]) - end handle Type.TYPE_MATCH => record_type_tr' ctxt tm - else raise Match (*give print translation of specialised record a chance*) - | _ => raise Match) - else record_type_tr' ctxt tm - end; - -in - -fun record_ext_type_tr' name = - let - val ext_type_name = Lexicon.mark_type (suffix ext_typeN name); - fun tr' ctxt ts = - record_type_tr' ctxt (list_comb (Syntax.const ext_type_name, ts)); - in (ext_type_name, tr') end; - -fun record_ext_type_abbr_tr' abbr alphas zeta last_ext schemeT name = - let - val ext_type_name = Lexicon.mark_type (suffix ext_typeN name); - fun tr' ctxt ts = - record_type_abbr_tr' abbr alphas zeta last_ext schemeT ctxt - (list_comb (Syntax.const ext_type_name, ts)); - in (ext_type_name, tr') end; - -end; - - -local - -(* FIXME Syntax.free (??) 
*) -fun field_tr' (c, t) = Syntax.const \<^syntax_const>\_field\ $ Syntax.const c $ t; -fun fields_tr' (t, u) = Syntax.const \<^syntax_const>\_fields\ $ t $ u; - -fun record_tr' ctxt t = - let - val thy = Proof_Context.theory_of ctxt; - - fun strip_fields t = - (case strip_comb t of - (Const (ext, _), args as (_ :: _)) => - (case try (Lexicon.unmark_const o unsuffix extN) ext of - SOME ext' => - (case get_extfields thy ext' of - SOME fields => - (let - val (f :: fs, _) = split_last (map fst fields); - val fields' = Proof_Context.extern_const ctxt f :: map Long_Name.base_name fs; - val (args', more) = split_last args; - in (fields' ~~ args') @ strip_fields more end - handle ListPair.UnequalLengths => [("", t)]) - | NONE => [("", t)]) - | NONE => [("", t)]) - | _ => [("", t)]); - - val (fields, (_, more)) = split_last (strip_fields t); - val _ = null fields andalso raise Match; - val u = foldr1 fields_tr' (map field_tr' fields); - in - (case more of - Const (\<^const_syntax>\Unity\, _) => Syntax.const \<^syntax_const>\_record\ $ u - | _ => Syntax.const \<^syntax_const>\_record_scheme\ $ u $ more) - end; - -in - -fun record_ext_tr' name = - let - val ext_name = Lexicon.mark_const (name ^ extN); - fun tr' ctxt ts = record_tr' ctxt (list_comb (Syntax.const ext_name, ts)); - in (ext_name, tr') end; - -end; - - -local - -fun dest_update ctxt c = - (case try Lexicon.unmark_const c of - SOME d => try (unsuffix updateN) (Proof_Context.extern_const ctxt d) - | NONE => NONE); - -fun field_updates_tr' ctxt (tm as Const (c, _) $ k $ u) = - (case dest_update ctxt c of - SOME name => - (case try Syntax_Trans.const_abs_tr' k of - SOME t => - apfst (cons (Syntax.const \<^syntax_const>\_field_update\ $ Syntax.free name $ t)) - (field_updates_tr' ctxt u) - | NONE => ([], tm)) - | NONE => ([], tm)) - | field_updates_tr' _ tm = ([], tm); - -fun record_update_tr' ctxt tm = - (case field_updates_tr' ctxt tm of - ([], _) => raise Match - | (ts, u) => - Syntax.const 
\<^syntax_const>\_record_update\ $ u $ - foldr1 (fn (v, w) => Syntax.const \<^syntax_const>\_field_updates\ $ v $ w) (rev ts)); - -in - -fun field_update_tr' name = - let - val update_name = Lexicon.mark_const (name ^ updateN); - fun tr' ctxt [t, u] = record_update_tr' ctxt (Syntax.const update_name $ t $ u) - | tr' _ _ = raise Match; - in (update_name, tr') end; - -end; - - - -(** record simprocs **) - -fun is_sel_upd_pair thy (Const (s, _)) (Const (u, t')) = - (case get_updates thy u of - SOME u_name => u_name = s - | NONE => raise TERM ("is_sel_upd_pair: not update", [Const (u, t')])); - -fun mk_comp_id f = - let val T = range_type (fastype_of f) - in HOLogic.mk_comp (\<^Const>\id T\, f) end; - -fun get_upd_funs (upd $ _ $ t) = upd :: get_upd_funs t - | get_upd_funs _ = []; - -fun get_accupd_simps ctxt term defset = - let - val thy = Proof_Context.theory_of ctxt; - - val (acc, [body]) = strip_comb term; - val upd_funs = sort_distinct Term_Ord.fast_term_ord (get_upd_funs body); - fun get_simp upd = - let - (* FIXME fresh "f" (!?) *) - val T = domain_type (fastype_of upd); - val lhs = HOLogic.mk_comp (acc, upd $ Free ("f", T)); - val rhs = - if is_sel_upd_pair thy acc upd - then HOLogic.mk_comp (Free ("f", T), acc) - else mk_comp_id acc; - val prop = lhs === rhs; - val othm = - Goal.prove ctxt [] [] prop - (fn {context = ctxt', ...} => - simp_tac (put_simpset defset ctxt') 1 THEN - REPEAT_DETERM (Iso_Tuple_Support.iso_tuple_intros_tac ctxt' 1) THEN - TRY (simp_tac (put_simpset HOL_ss ctxt' addsimps @{thms id_apply id_o o_id}) 1)); - val dest = - if is_sel_upd_pair thy acc upd - then @{thm o_eq_dest} - else @{thm o_eq_id_dest}; - in Drule.export_without_context (othm RS dest) end; - in map get_simp upd_funs end; - -fun get_updupd_simp ctxt defset u u' comp = - let - (* FIXME fresh "f" (!?) 
*) - val f = Free ("f", domain_type (fastype_of u)); - val f' = Free ("f'", domain_type (fastype_of u')); - val lhs = HOLogic.mk_comp (u $ f, u' $ f'); - val rhs = - if comp - then u $ HOLogic.mk_comp (f, f') - else HOLogic.mk_comp (u' $ f', u $ f); - val prop = lhs === rhs; - val othm = - Goal.prove ctxt [] [] prop - (fn {context = ctxt', ...} => - simp_tac (put_simpset defset ctxt') 1 THEN - REPEAT_DETERM (Iso_Tuple_Support.iso_tuple_intros_tac ctxt' 1) THEN - TRY (simp_tac (put_simpset HOL_ss ctxt' addsimps @{thms id_apply}) 1)); - val dest = if comp then @{thm o_eq_dest_lhs} else @{thm o_eq_dest}; - in Drule.export_without_context (othm RS dest) end; - -fun gen_get_updupd_simps ctxt upd_funs defset = - let - val cname = fst o dest_Const; - fun getswap u u' = get_updupd_simp ctxt defset u u' (cname u = cname u'); - fun build_swaps_to_eq _ [] swaps = swaps - | build_swaps_to_eq upd (u :: us) swaps = - let - val key = (cname u, cname upd); - val newswaps = - if Symreltab.defined swaps key then swaps - else Symreltab.insert (K true) (key, getswap u upd) swaps; - in - if cname u = cname upd then newswaps - else build_swaps_to_eq upd us newswaps - end; - fun swaps_needed [] _ _ swaps = map snd (Symreltab.dest swaps) - | swaps_needed (u :: us) prev seen swaps = - if Symtab.defined seen (cname u) - then swaps_needed us prev seen (build_swaps_to_eq u prev swaps) - else swaps_needed us (u :: prev) (Symtab.insert (K true) (cname u, ()) seen) swaps; - in swaps_needed upd_funs [] Symtab.empty Symreltab.empty end; - -fun get_updupd_simps ctxt term defset = gen_get_updupd_simps ctxt (get_upd_funs term) defset; - -fun prove_unfold_defs thy upd_funs ex_simps ex_simprs prop = - let - val ctxt = Proof_Context.init_global thy; - - val defset = get_sel_upd_defs thy; - val prop' = Envir.beta_eta_contract prop; - val (lhs, _) = Logic.dest_equals (Logic.strip_assums_concl prop'); - val (_, args) = strip_comb lhs; - val simps = - if null upd_funs then - (if length args = 1 then 
get_accupd_simps else get_updupd_simps) ctxt lhs defset
-      else gen_get_updupd_simps ctxt upd_funs defset
-  in
-    Goal.prove ctxt [] [] prop'
-      (fn {context = ctxt', ...} =>
-        simp_tac (put_simpset HOL_basic_ss ctxt' addsimps (simps @ @{thms K_record_comp})) 1 THEN
-        TRY (simp_tac (put_simpset HOL_basic_ss ctxt' addsimps ex_simps addsimprocs ex_simprs) 1))
-  end;
-
-
-local
-
-fun eq (s1: string) (s2: string) = (s1 = s2);
-
-fun has_field extfields f T =
-  exists (fn (eN, _) => exists (eq f o fst) (Symtab.lookup_list extfields eN)) (dest_recTs T);
-
-fun K_skeleton n (T as Type (_, [_, kT])) (b as Bound i) (Abs (x, xT, t)) =
-      if null (loose_bnos t) then ((n, kT), (Abs (x, xT, Bound (i + 1)))) else ((n, T), b)
-  | K_skeleton n T b _ = ((n, T), b);
-
-in
-
-(* simproc *)
-
-(*
-  Simplify selections of a record update:
-    (1)  S (S_update k r) = k (S r)
-    (2)  S (X_update k r) = S r
-
-  The simproc skips multiple updates at once, e.g.:
-    S (X_update x (Y_update y (S_update k r))) = k (S r)
-
-  But be careful in (2) because of the extensibility of records.
-
-  If S is a more-selector we have to make sure that the update on component
-  X does not affect the selected subrecord.
-
-  If X is a more-selector we have to make sure that S is not in the updated
-  subrecord.
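-
-  A small worked instance (illustrative only; S and X stand for arbitrary
-  distinct fields of the same record):
-    S (X_update x (S_update k r))
-      = S (S_update k r)    by (2), since the X update cannot affect S
-      = k (S r)             by (1)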
-*) -val _ = Theory.setup (Named_Target.theory_map (Simplifier.define_simproc \<^binding>\select_update\ - {lhss = [\<^term>\x::'a::{}\], - proc = fn _ => fn ctxt => fn ct => - let - val thy = Proof_Context.theory_of ctxt; - val t = Thm.term_of ct; - in - (case t of - (sel as Const (s, Type (_, [_, rangeS]))) $ - ((upd as Const (u, Type (_, [_, Type (_, [rT, _])]))) $ k $ r) => - if is_selector thy s andalso is_some (get_updates thy u) then - let - val {sel_upd = {updates, ...}, extfields, ...} = Data.get thy; - - fun mk_eq_terms ((upd as Const (u, Type(_, [kT, _]))) $ k $ r) = - (case Symtab.lookup updates u of - NONE => NONE - | SOME u_name => - if u_name = s then - (case mk_eq_terms r of - NONE => - let - val rv = ("r", rT); - val rb = Bound 0; - val (kv, kb) = K_skeleton "k" kT (Bound 1) k; - in SOME (upd $ kb $ rb, kb $ (sel $ rb), [kv, rv]) end - | SOME (trm, trm', vars) => - let - val (kv, kb) = K_skeleton "k" kT (Bound (length vars)) k; - in SOME (upd $ kb $ trm, kb $ trm', kv :: vars) end) - else if has_field extfields u_name rangeS orelse - has_field extfields s (domain_type kT) then NONE - else - (case mk_eq_terms r of - SOME (trm, trm', vars) => - let val (kv, kb) = K_skeleton "k" kT (Bound (length vars)) k - in SOME (upd $ kb $ trm, trm', kv :: vars) end - | NONE => - let - val rv = ("r", rT); - val rb = Bound 0; - val (kv, kb) = K_skeleton "k" kT (Bound 1) k; - in SOME (upd $ kb $ rb, sel $ rb, [kv, rv]) end)) - | mk_eq_terms _ = NONE; - in - (case mk_eq_terms (upd $ k $ r) of - SOME (trm, trm', vars) => - SOME - (prove_unfold_defs thy [] [] [] - (Logic.list_all (vars, Logic.mk_equals (sel $ trm, trm')))) - | NONE => NONE) - end - else NONE - | _ => NONE) - end})); - -val simproc_name = - Simplifier.check_simproc (Context.the_local_context ()) ("select_update", Position.none); -val simproc = Simplifier.the_simproc (Context.the_local_context ()) simproc_name; - -fun get_upd_acc_cong_thm upd acc thy ss = - let - val ctxt = Proof_Context.init_global thy; 
- val prop = - infer_instantiate ctxt - [(("upd", 0), Thm.cterm_of ctxt upd), (("ac", 0), Thm.cterm_of ctxt acc)] - updacc_cong_triv - |> Thm.concl_of; - in - Goal.prove ctxt [] [] prop - (fn {context = ctxt', ...} => - simp_tac (put_simpset ss ctxt') 1 THEN - REPEAT_DETERM (Iso_Tuple_Support.iso_tuple_intros_tac ctxt' 1) THEN - TRY (resolve_tac ctxt' [updacc_cong_idI] 1)) - end; - -fun sorted ord [] = true - | sorted ord [x] = true - | sorted ord (x::y::xs) = - (case ord (x, y) of - LESS => sorted ord (y::xs) - | EQUAL => sorted ord (y::xs) - | GREATER => false) - -fun insert_unique ord x [] = [x] - | insert_unique ord x (y::ys) = - (case ord (x, y) of - LESS => (x::y::ys) - | EQUAL => (x::ys) - | GREATER => y :: insert_unique ord x ys) - -fun insert_unique_hd ord (x::xs) = x :: insert_unique ord x xs - | insert_unique_hd ord xs = xs - - -(* upd_simproc *) - -(*Simplify multiple updates: - (1) "N_update y (M_update g (N_update x (M_update f r))) = - (N_update (y o x) (M_update (g o f) r))" - (2) "r(|M:= M r|) = r" - - In both cases "more" updates complicate matters: for this reason - we omit considering further updates if doing so would introduce - both a more update and an update to a field within it.*) -val _ = Theory.setup (Named_Target.theory_map (Simplifier.define_simproc \<^binding>\update\ - {lhss = [\<^term>\x::'a::{}\], - proc = fn _ => fn ctxt => fn ct => - let - val thy = Proof_Context.theory_of ctxt; - val t = Thm.term_of ct; - (*We can use more-updators with other updators as long - as none of the other updators go deeper than any more - updator. 
min here is the depth of the deepest other - updator, max the depth of the shallowest more updator.*) - fun include_depth (dep, true) (min, max) = - if min <= dep - then SOME (min, if dep <= max orelse max = ~1 then dep else max) - else NONE - | include_depth (dep, false) (min, max) = - if dep <= max orelse max = ~1 - then SOME (if min <= dep then dep else min, max) - else NONE; - - fun getupdseq (term as (upd as Const (u, _)) $ f $ tm) min max = - (case get_update_details u thy of - SOME (s, dep, ismore) => - (case include_depth (dep, ismore) (min, max) of - SOME (min', max') => - let val (us, bs, _) = getupdseq tm min' max' - in ((upd, s, f) :: us, bs, fastype_of term) end - | NONE => ([], term, HOLogic.unitT)) - | NONE => ([], term, HOLogic.unitT)) - | getupdseq term _ _ = ([], term, HOLogic.unitT); - - val (upds, base, baseT) = getupdseq t 0 ~1; - val orig_upds = map_index (fn (i, (x, y, z)) => (x, y, z, i)) upds - val upd_ord = rev_order o fast_string_ord o apply2 #2 - val (upds, commuted) = - if not (null orig_upds) andalso Config.get ctxt sort_updates andalso not (sorted upd_ord orig_upds) then - (sort upd_ord orig_upds, true) - else - (orig_upds, false) - - fun is_upd_noop s (Abs (n, T, Const (s', T') $ tm')) tm = - if s = s' andalso null (loose_bnos tm') - andalso subst_bound (HOLogic.unit, tm') = tm - then (true, Abs (n, T, Const (s', T') $ Bound 1)) - else (false, HOLogic.unit) - | is_upd_noop _ _ _ = (false, HOLogic.unit); - - fun get_noop_simps (upd as Const _) (Abs (_, _, (acc as Const _) $ _)) = - let - val ss = get_sel_upd_defs thy; - val uathm = get_upd_acc_cong_thm upd acc thy ss; - in - [Drule.export_without_context (uathm RS updacc_noopE), - Drule.export_without_context (uathm RS updacc_noop_compE)] - end; - - (*If f is constant then (f o g) = f. 
We know that K_skeleton - only returns constant abstractions thus when we see an - abstraction we can discard inner updates.*) - fun add_upd (f as Abs _) _ = [f] - | add_upd f fs = (f :: fs); - - (*mk_updterm returns - (orig-term-skeleton-update list , simplified-skeleton, - variables, duplicate-updates, simp-flag, noop-simps) - - where duplicate-updates is a table used to pass upward - the list of update functions which can be composed - into an update above them, simp-flag indicates whether - any simplification was achieved, and noop-simps are - used for eliminating case (2) defined above*) - fun mk_updterm ((upd as Const (u, T), s, f, i) :: upds) above term = - let - val (lhs_upds, rhs, vars, dups, simp, noops) = - mk_updterm upds (Symtab.update (u, ()) above) term; - val (fvar, skelf) = - K_skeleton (Long_Name.base_name s) (domain_type T) (Bound (length vars)) f; - val (isnoop, skelf') = is_upd_noop s f term; - val funT = domain_type T; - fun mk_comp_local (f, f') = - Const (\<^const_name>\Fun.comp\, funT --> funT --> funT) $ f $ f'; - in - if isnoop then - ((upd $ skelf', i)::lhs_upds, rhs, vars, - Symtab.update (u, []) dups, true, - if Symtab.defined noops u then noops - else Symtab.update (u, get_noop_simps upd skelf') noops) - else if Symtab.defined above u then - ((upd $ skelf, i)::lhs_upds, rhs, fvar :: vars, - Symtab.map_default (u, []) (add_upd skelf) dups, - true, noops) - else - (case Symtab.lookup dups u of - SOME fs => - ((upd $ skelf, i)::lhs_upds, - upd $ foldr1 mk_comp_local (add_upd skelf fs) $ rhs, - fvar :: vars, dups, true, noops) - | NONE => ((upd $ skelf, i)::lhs_upds, upd $ skelf $ rhs, fvar :: vars, dups, simp, noops)) - end - | mk_updterm [] _ _ = - ([], Bound 0, [("r", baseT)], Symtab.empty, false, Symtab.empty) - | mk_updterm us _ _ = raise TERM ("mk_updterm match", map (fn (x, _, _, _) => x) us); - - val (lhs_upds, rhs, vars, _, simp, noops) = mk_updterm upds Symtab.empty base; - val orig_order_lhs_upds = lhs_upds |> sort (rev_order o 
int_ord o apply2 snd) - val lhs = Bound 0 |> fold (fn (upd, _) => fn s => upd $ s) orig_order_lhs_upds - (* Note that the simplifier works bottom up. So all nested updates are already - normalised, e.g. sorted. 'commuted' thus means that the outermost update has to be - inserted at its place inside the sorted nested updates. The necessary swaps can be - expressed via 'upd_funs' by replicating the outer update at the designated position: *) - val upd_funs = (if commuted then insert_unique_hd upd_ord orig_upds else orig_upds) |> map #1 - val noops' = maps snd (Symtab.dest noops); - in - if simp orelse commuted then - SOME - (prove_unfold_defs thy upd_funs noops' [simproc] - (Logic.list_all (vars, Logic.mk_equals (lhs, rhs)))) - else NONE - end})); - -val upd_simproc_name = - Simplifier.check_simproc (Context.the_local_context ()) ("update", Position.none); -val upd_simproc = Simplifier.the_simproc (Context.the_local_context ()) upd_simproc_name; - -end; - - -(* eq_simproc *) - -(*Look up the most specific record-equality. - - Note on efficiency: - Testing equality of records boils down to the test of equality of all components. - Therefore the complexity is: #components * complexity for single component. - Especially if a record has a lot of components it may be better to split up - the record first and do simplification on that (split_simp_tac). - e.g. 
r(|lots of updates|) = x
-
-              eq_simproc             split_simp_tac
-  Complexity: #components * #updates     #updates
-*)
-
-val _ = Theory.setup (Named_Target.theory_map (Simplifier.define_simproc \<^binding>\eq\
-  {lhss = [\<^term>\r = s\],
-   proc = fn _ => fn ctxt => fn ct =>
-    (case Thm.term_of ct of
-      \<^Const_>\HOL.eq T for _ _\ =>
-        (case rec_id ~1 T of
-          "" => NONE
-        | name =>
-            (case get_equalities (Proof_Context.theory_of ctxt) name of
-              NONE => NONE
-            | SOME thm => SOME (thm RS @{thm Eq_TrueI})))
-    | _ => NONE)}));
-
-val eq_simproc_name = Simplifier.check_simproc (Context.the_local_context ()) ("eq", Position.none);
-val eq_simproc = Simplifier.the_simproc (Context.the_local_context ()) eq_simproc_name;
-
-(* split_simproc *)
-
-(*Split quantified occurrences of records, for which P holds.  P can peek on the
-  subterm starting at the quantified occurrence of the record (including the quantifier):
-    P t = 0: do not split
-    P t = ~1: completely split
-    P t > 0: split up to given bound of record extensions.*)
-fun split_simproc P =
-  Simplifier.make_simproc \<^context> "record_split"
-   {lhss = [\<^term>\x::'a::{}\],
-    proc = fn _ => fn ctxt => fn ct =>
-      (case Thm.term_of ct of
-        Const (quantifier, Type (_, [Type (_, [T, _]), _])) $ _ =>
-          if quantifier = \<^const_name>\Pure.all\ orelse
-            quantifier = \<^const_name>\All\ orelse
-            quantifier = \<^const_name>\Ex\
-          then
-            (case rec_id ~1 T of
-              "" => NONE
-            | _ =>
-                let val split = P (Thm.term_of ct) in
-                  if split <> 0 then
-                    (case get_splits (Proof_Context.theory_of ctxt) (rec_id split T) of
-                      NONE => NONE
-                    | SOME (all_thm, All_thm, Ex_thm, _) =>
-                        SOME
-                          (case quantifier of
-                            \<^const_name>\Pure.all\ => all_thm
-                          | \<^const_name>\All\ => All_thm RS @{thm eq_reflection}
-                          | \<^const_name>\Ex\ => Ex_thm RS @{thm eq_reflection}
-                          | _ => raise Fail "split_simproc"))
-                  else NONE
-                end)
-          else NONE
-      | _ => NONE)};
-
-val _ = Theory.setup (Named_Target.theory_map (Simplifier.define_simproc \<^binding>\ex_sel_eq\
-  {lhss =
[\<^term>\Ex t\], - proc = fn _ => fn ctxt => fn ct => - let - val thy = Proof_Context.theory_of ctxt; - val t = Thm.term_of ct; - fun mkeq (lr, T, (sel, Tsel), x) i = - if is_selector thy sel then - let - val x' = - if not (Term.is_dependent x) - then Free ("x" ^ string_of_int i, range_type Tsel) - else raise TERM ("", [x]); - val sel' = Const (sel, Tsel) $ Bound 0; - val (l, r) = if lr then (sel', x') else (x', sel'); - in \<^Const>\HOL.eq T for l r\ end - else raise TERM ("", [Const (sel, Tsel)]); - - fun dest_sel_eq (\<^Const_>\HOL.eq T\ $ (Const (sel, Tsel) $ Bound 0) $ X) = - (true, T, (sel, Tsel), X) - | dest_sel_eq (\<^Const_>\HOL.eq T\ $ X $ (Const (sel, Tsel) $ Bound 0)) = - (false, T, (sel, Tsel), X) - | dest_sel_eq _ = raise TERM ("", []); - in - (case t of - \<^Const_>\Ex T for \Abs (s, _, t)\\ => - (let - val eq = mkeq (dest_sel_eq t) 0; - val prop = - Logic.list_all ([("r", T)], - Logic.mk_equals (\<^Const>\Ex T for \Abs (s, T, eq)\\, \<^Const>\True\)); - in - SOME (Goal.prove_sorry_global thy [] [] prop - (fn {context = ctxt', ...} => - simp_tac (put_simpset (get_simpset thy) ctxt' - addsimps @{thms simp_thms} addsimprocs [split_simproc (K ~1)]) 1)) - end handle TERM _ => NONE) - | _ => NONE) - end})); - -val ex_sel_eq_simproc_name = - Simplifier.check_simproc (Context.the_local_context ()) ("ex_sel_eq", Position.none); -val ex_sel_eq_simproc = Simplifier.the_simproc (Context.the_local_context ()) ex_sel_eq_simproc_name; -val _ = Theory.setup (map_theory_simpset (fn ctxt => ctxt delsimprocs [ex_sel_eq_simproc])); - - -(* split_simp_tac *) - -(*Split (and simplify) all records in the goal for which P holds. - For quantified occurrences of a record - P can peek on the whole subterm (including the quantifier); for free variables P - can only peek on the variable itself. 
- P t = 0: do not split - P t = ~1: completely split - P t > 0: split up to given bound of record extensions.*) -fun split_simp_tac ctxt thms P = CSUBGOAL (fn (cgoal, i) => - let - val thy = Proof_Context.theory_of ctxt; - - val goal = Thm.term_of cgoal; - val frees = filter (is_recT o #2) (Term.add_frees goal []); - - val has_rec = exists_Const - (fn (s, Type (_, [Type (_, [T, _]), _])) => - (s = \<^const_name>\Pure.all\ orelse s = \<^const_name>\All\ orelse s = \<^const_name>\Ex\) - andalso is_recT T - | _ => false); - - fun mk_split_free_tac free induct_thm i = - let - val _ $ (_ $ Var (r, _)) = Thm.concl_of induct_thm; - val thm = infer_instantiate ctxt [(r, Thm.cterm_of ctxt free)] induct_thm; - in - simp_tac (put_simpset HOL_basic_ss ctxt addsimps @{thms induct_atomize}) i THEN - resolve_tac ctxt [thm] i THEN - simp_tac (put_simpset HOL_basic_ss ctxt addsimps @{thms induct_rulify}) i - end; - - val split_frees_tacs = - frees |> map_filter (fn (x, T) => - (case rec_id ~1 T of - "" => NONE - | _ => - let - val free = Free (x, T); - val split = P free; - in - if split <> 0 then - (case get_splits thy (rec_id split T) of - NONE => NONE - | SOME (_, _, _, induct_thm) => - SOME (mk_split_free_tac free induct_thm i)) - else NONE - end)); - - val simprocs = if has_rec goal then [split_simproc P] else []; - val thms' = @{thms o_apply K_record_comp} @ thms; - in - EVERY split_frees_tacs THEN - full_simp_tac (put_simpset (get_simpset thy) ctxt addsimps thms' addsimprocs simprocs) i - end); - - -(* split_tac *) - -(*Split all records in the goal, which are quantified by !! 
or ALL.*) -fun split_tac ctxt = CSUBGOAL (fn (cgoal, i) => - let - val goal = Thm.term_of cgoal; - - val has_rec = exists_Const - (fn (s, Type (_, [Type (_, [T, _]), _])) => - (s = \<^const_name>\Pure.all\ orelse s = \<^const_name>\All\) andalso is_recT T - | _ => false); - - fun is_all (Const (\<^const_name>\Pure.all\, _) $ _) = ~1 - | is_all (Const (\<^const_name>\All\, _) $ _) = ~1 - | is_all _ = 0; - in - if has_rec goal then - full_simp_tac (put_simpset HOL_basic_ss ctxt addsimprocs [split_simproc is_all]) i - else no_tac - end); - -(* wrapper *) - -val split_name = "record_split_tac"; -val split_wrapper = (split_name, fn ctxt => fn tac => split_tac ctxt ORELSE' tac); - - - -(** theory extender interface **) - -(* attributes *) - -fun case_names_fields x = Rule_Cases.case_names ["fields"] x; -fun induct_type_global name = [case_names_fields, Induct.induct_type name]; -fun cases_type_global name = [case_names_fields, Induct.cases_type name]; - - -(* tactics *) - -(*Do case analysis / induction according to rule on last parameter of ith subgoal - (or on s if there are no parameters). 
-  Instantiation of record variable (and predicate) in rule is calculated to
-  avoid problems with higher order unification.*)
-fun try_param_tac ctxt s rule = CSUBGOAL (fn (cgoal, i) =>
-  let
-    val g = Thm.term_of cgoal;
-    val params = Logic.strip_params g;
-    val concl = HOLogic.dest_Trueprop (Logic.strip_assums_concl g);
-    val rule' = Thm.lift_rule cgoal rule;
-    val (P, ys) = strip_comb (HOLogic.dest_Trueprop (Logic.strip_assums_concl (Thm.prop_of rule')));
-    (*ca indicates if rule is a case analysis or induction rule*)
-    val (x, ca) =
-      (case rev (drop (length params) ys) of
-        [] => (head_of (fst (HOLogic.dest_eq (HOLogic.dest_Trueprop
-          (hd (rev (Logic.strip_assums_hyp (hd (Thm.prems_of rule')))))))), true)
-      | [x] => (head_of x, false));
-    val rule'' =
-      infer_instantiate ctxt
-        (map (apsnd (Thm.cterm_of ctxt))
-          (case rev params of
-            [] =>
-              (case AList.lookup (op =) (Term.add_frees g []) s of
-                NONE => error "try_param_tac: no such variable"
-              | SOME T =>
-                  [(#1 (dest_Var P), if ca then concl else lambda (Free (s, T)) concl),
-                   (#1 (dest_Var x), Free (s, T))])
-          | (_, T) :: _ =>
-              [(#1 (dest_Var P),
-                fold_rev Term.abs params
-                  (if ca then concl else incr_boundvars 1 (Abs (s, T, concl)))),
-               (#1 (dest_Var x), fold_rev Term.abs params (Bound 0))])) rule';
-  in compose_tac ctxt (false, rule'', Thm.nprems_of rule) i end);
-
-
-fun extension_definition overloaded name fields alphas zeta moreT more vars thy =
-  let
-    val ctxt = Proof_Context.init_global thy;
-
-    val base_name = Long_Name.base_name name;
-
-    val fieldTs = map snd fields;
-    val fields_moreTs = fieldTs @ [moreT];
-
-    val alphas_zeta = alphas @ [zeta];
-
-    val ext_binding = Binding.name (suffix extN base_name);
-    val ext_name = suffix extN name;
-    val ext_tyco = suffix ext_typeN name;
-    val extT = Type (ext_tyco, map TFree alphas_zeta);
-    val ext_type = fields_moreTs ---> extT;
-
-
-    (* the tree of new types that will back the record extension *)
-
-    val mktreeV = Balanced_Tree.make
Iso_Tuple_Support.mk_cons_tuple; - - fun mk_iso_tuple (left, right) (thy, i) = - let - val suff = if i = 0 then ext_typeN else inner_typeN ^ string_of_int i; - val ((_, cons), thy') = thy - |> Iso_Tuple_Support.add_iso_tuple_type overloaded - (Binding.suffix_name suff (Binding.name base_name), alphas_zeta) - (fastype_of left, fastype_of right); - in - (cons $ left $ right, (thy', i + 1)) - end; - - (*trying to create a 1-element iso_tuple will fail, and is pointless anyway*) - fun mk_even_iso_tuple [arg] = pair arg - | mk_even_iso_tuple args = mk_iso_tuple (Iso_Tuple_Support.dest_cons_tuple (mktreeV args)); - - fun build_meta_tree_type i thy vars more = - let val len = length vars in - if len < 1 then raise TYPE ("meta_tree_type args too short", [], vars) - else if len > 16 then - let - fun group16 [] = [] - | group16 xs = take 16 xs :: group16 (drop 16 xs); - val vars' = group16 vars; - val (composites, (thy', i')) = fold_map mk_even_iso_tuple vars' (thy, i); - in - build_meta_tree_type i' thy' composites more - end - else - let val (term, (thy', _)) = mk_iso_tuple (mktreeV vars, more) (thy, 0) - in (term, thy') end - end; - - val _ = timing_msg ctxt "record extension preparing definitions"; - - - (* 1st stage part 1: introduce the tree of new types *) - - val (ext_body, typ_thy) = timeit_msg ctxt "record extension nested type def:" (fn () => - build_meta_tree_type 1 thy vars more); - - - (* prepare declarations and definitions *) - - (* 1st stage part 2: define the ext constant *) - - fun mk_ext args = list_comb (Const (ext_name, ext_type), args); - val ext_spec = Logic.mk_equals (mk_ext (vars @ [more]), ext_body); - - val ([ext_def], defs_thy) = timeit_msg ctxt "record extension constructor def:" (fn () => - typ_thy - |> Sign.declare_const_global ((ext_binding, ext_type), NoSyn) |> snd - |> Global_Theory.add_defs false [((Thm.def_binding ext_binding, ext_spec), [])]); - val defs_ctxt = Proof_Context.init_global defs_thy; - - - (* prepare propositions *) - - val 
_ = timing_msg ctxt "record extension preparing propositions"; - val vars_more = vars @ [more]; - val variants = map (fn Free (x, _) => x) vars_more; - val ext = mk_ext vars_more; - val s = Free (rN, extT); - val P = Free (singleton (Name.variant_list variants) "P", extT --> HOLogic.boolT); - - val inject_prop = (* FIXME local x x' *) - let val vars_more' = map (fn (Free (x, T)) => Free (x ^ "'", T)) vars_more in - HOLogic.mk_conj (HOLogic.eq_const extT $ - mk_ext vars_more $ mk_ext vars_more', \<^term>\True\) - === - foldr1 HOLogic.mk_conj - (map HOLogic.mk_eq (vars_more ~~ vars_more') @ [\<^term>\True\]) - end; - - val induct_prop = (fold_rev Logic.all vars_more (Trueprop (P $ ext)), Trueprop (P $ s)); - - val split_meta_prop = (* FIXME local P *) - let val P = Free (singleton (Name.variant_list variants) "P", extT --> propT) - in Logic.mk_equals (Logic.all s (P $ s), fold_rev Logic.all vars_more (P $ ext)) end; - - val inject = timeit_msg ctxt "record extension inject proof:" (fn () => - simplify (put_simpset HOL_ss defs_ctxt) - (Goal.prove_sorry_global defs_thy [] [] inject_prop - (fn {context = ctxt', ...} => - simp_tac (put_simpset HOL_basic_ss ctxt' addsimps [ext_def]) 1 THEN - REPEAT_DETERM - (resolve_tac ctxt' @{thms refl_conj_eq} 1 ORELSE - Iso_Tuple_Support.iso_tuple_intros_tac ctxt' 1 ORELSE - resolve_tac ctxt' [refl] 1)))); - - - (*We need a surjection property r = (| f = f r, g = g r ... |) - to prove other theorems. We haven't given names to the accessors - f, g etc yet however, so we generate an ext structure with - free variables as all arguments and allow the introduction tactic to - operate on it as far as it can. 
We then use Drule.export_without_context - to convert the free variables into unifiable variables and unify them with - (roughly) the definition of the accessor.*) - val surject = timeit_msg ctxt "record extension surjective proof:" (fn () => - let - val start = - infer_instantiate defs_ctxt [(("y", 0), Thm.cterm_of defs_ctxt ext)] surject_assist_idE; - val tactic1 = - simp_tac (put_simpset HOL_basic_ss defs_ctxt addsimps [ext_def]) 1 THEN - REPEAT_ALL_NEW (Iso_Tuple_Support.iso_tuple_intros_tac defs_ctxt) 1; - val tactic2 = - REPEAT (resolve_tac defs_ctxt [surject_assistI] 1 THEN resolve_tac defs_ctxt [refl] 1); - val [halfway] = Seq.list_of (tactic1 start); - val [surject] = Seq.list_of (tactic2 (Drule.export_without_context halfway)); - in - surject - end); - - val split_meta = timeit_msg ctxt "record extension split_meta proof:" (fn () => - Goal.prove_sorry_global defs_thy [] [] split_meta_prop - (fn {context = ctxt', ...} => - EVERY1 - [resolve_tac ctxt' @{thms equal_intr_rule}, - Goal.norm_hhf_tac ctxt', - eresolve_tac ctxt' @{thms meta_allE}, assume_tac ctxt', - resolve_tac ctxt' [@{thm prop_subst} OF [surject]], - REPEAT o eresolve_tac ctxt' @{thms meta_allE}, assume_tac ctxt'])); - - val induct = timeit_msg ctxt "record extension induct proof:" (fn () => - let val (assm, concl) = induct_prop in - Goal.prove_sorry_global defs_thy [] [assm] concl - (fn {context = ctxt', prems, ...} => - cut_tac (split_meta RS Drule.equal_elim_rule2) 1 THEN - resolve_tac ctxt' prems 2 THEN - asm_simp_tac (put_simpset HOL_ss ctxt') 1) - end); - - val ([(_, [induct']), (_, [inject']), (_, [surjective']), (_, [split_meta'])], thm_thy) = - defs_thy - |> Global_Theory.note_thmss "" - [((Binding.name "ext_induct", []), [([induct], [])]), - ((Binding.name "ext_inject", []), [([inject], [])]), - ((Binding.name "ext_surjective", []), [([surject], [])]), - ((Binding.name "ext_split", []), [([split_meta], [])])]; - in - (((ext_name, ext_type), (ext_tyco, alphas_zeta), - extT, induct', 
inject', surjective', split_meta', ext_def), thm_thy)
-  end;
-
-fun chunks [] [] = []
-  | chunks [] xs = [xs]
-  | chunks (l :: ls) xs = take l xs :: chunks ls (drop l xs);
-
-fun chop_last [] = error "chop_last: list should not be empty"
-  | chop_last [x] = ([], x)
-  | chop_last (x :: xs) = let val (tl, l) = chop_last xs in (x :: tl, l) end;
-
-fun subst_last _ [] = error "subst_last: list should not be empty"
-  | subst_last s [_] = [s]
-  | subst_last s (x :: xs) = x :: subst_last s xs;
-
-
-(* mk_recordT *)
-
-(*build up the record type from the current extension type extT and a list
-  of parent extensions, starting with the root of the record hierarchy*)
-fun mk_recordT extT =
-  fold_rev (fn (parent, Ts) => fn T => Type (parent, subst_last T Ts)) extT;
-
-
-(* code generation *)
-
-fun mk_random_eq tyco vs extN Ts =
-  let
-    (* FIXME local i etc. *)
-    val size = \<^term>\i::natural\;
-    fun termifyT T = HOLogic.mk_prodT (T, \<^typ>\unit \ term\);
-    val T = Type (tyco, map TFree vs);
-    val Tm = termifyT T;
-    val params = Name.invent_names Name.context "x" Ts;
-    val lhs = HOLogic.mk_random T size;
-    val tc = HOLogic.mk_return Tm \<^typ>\Random.seed\
-      (HOLogic.mk_valtermify_app extN params T);
-    val rhs =
-      HOLogic.mk_ST
-        (map (fn (v, T') =>
-          ((HOLogic.mk_random T' size, \<^typ>\Random.seed\), SOME (v, termifyT T'))) params)
-        tc \<^typ>\Random.seed\ (SOME Tm, \<^typ>\Random.seed\);
-  in
-    (lhs, rhs)
-  end
-
-fun mk_full_exhaustive_eq tyco vs extN Ts =
-  let
-    (* FIXME local i etc.
*) - val size = \<^term>\i::natural\; - fun termifyT T = HOLogic.mk_prodT (T, \<^typ>\unit \ term\); - val T = Type (tyco, map TFree vs); - val test_function = Free ("f", termifyT T --> \<^typ>\(bool \ term list) option\); - val params = Name.invent_names Name.context "x" Ts; - fun mk_full_exhaustive U = \<^Const>\full_exhaustive_class.full_exhaustive U\; - val lhs = mk_full_exhaustive T $ test_function $ size; - val tc = test_function $ (HOLogic.mk_valtermify_app extN params T); - val rhs = fold_rev (fn (v, U) => fn cont => - mk_full_exhaustive U $ (lambda (Free (v, termifyT U)) cont) $ size) params tc; - in - (lhs, rhs) - end; - -fun instantiate_sort_record (sort, mk_eq) tyco vs extN Ts thy = - let - val eq = HOLogic.mk_Trueprop (HOLogic.mk_eq (mk_eq tyco vs extN Ts)); - in - thy - |> Class.instantiation ([tyco], vs, sort) - |> `(fn lthy => Syntax.check_term lthy eq) - |-> (fn eq => Specification.definition NONE [] [] ((Binding.concealed Binding.empty, []), eq)) - |> snd - |> Class.prove_instantiation_exit (fn ctxt => Class.intro_classes_tac ctxt []) - end; - -fun ensure_sort_record (sort, mk_eq) ext_tyco vs extN Ts thy = - let - val algebra = Sign.classes_of thy; - val has_inst = Sorts.has_instance algebra ext_tyco sort; - in - if has_inst then thy - else - (case Quickcheck_Common.perhaps_constrain thy (map (rpair sort) Ts) vs of - SOME constrain => - instantiate_sort_record (sort, mk_eq) ext_tyco (map constrain vs) extN - ((map o map_atyps) (fn TFree v => TFree (constrain v)) Ts) thy - | NONE => thy) - end; - -fun add_code ext_tyco vs extT ext simps inject thy = - if Config.get_global thy codegen then - let - val eq = HOLogic.mk_Trueprop (HOLogic.mk_eq (\<^Const>\HOL.equal extT\, \<^Const>\HOL.eq extT\)); - fun tac ctxt eq_def = - Class.intro_classes_tac ctxt [] - THEN rewrite_goals_tac ctxt [Simpdata.mk_eq eq_def] - THEN ALLGOALS (resolve_tac ctxt @{thms refl}); - fun mk_eq ctxt eq_def = - rewrite_rule ctxt - [Axclass.unoverload ctxt (Thm.symmetric 
(Simpdata.mk_eq eq_def))] inject; - fun mk_eq_refl ctxt = - \<^instantiate>\'a = \Thm.ctyp_of ctxt (Logic.varifyT_global extT)\ in - lemma (schematic) \equal_class.equal x x \ True\ by (rule equal_refl)\ - |> Axclass.unoverload ctxt; - val ensure_random_record = ensure_sort_record (\<^sort>\random\, mk_random_eq); - val ensure_exhaustive_record = - ensure_sort_record (\<^sort>\full_exhaustive\, mk_full_exhaustive_eq); - fun add_code eq_def thy = - let - val ctxt = Proof_Context.init_global thy; - in - thy - |> Code.declare_default_eqns_global [(mk_eq ctxt eq_def, true), (mk_eq_refl ctxt, false)] - end; - in - thy - |> Code.declare_datatype_global [ext] - |> Code.declare_default_eqns_global (map (rpair true) simps) - |> Class.instantiation ([ext_tyco], vs, [HOLogic.class_equal]) - |> `(fn lthy => Syntax.check_term lthy eq) - |-> (fn eq => Specification.definition NONE [] [] (Binding.empty_atts, eq)) - |-> (fn (_, (_, eq_def)) => - Class.prove_instantiation_exit_result Morphism.thm tac eq_def) - |-> add_code - |> ensure_random_record ext_tyco vs (fst ext) (binder_types (snd ext)) - |> ensure_exhaustive_record ext_tyco vs (fst ext) (binder_types (snd ext)) - end - else thy; - -fun add_ctr_sugar ctr exhaust inject sel_thms = - Ctr_Sugar.default_register_ctr_sugar_global (K true) - {kind = Ctr_Sugar.Record, T = body_type (fastype_of ctr), ctrs = [ctr], casex = Term.dummy, - discs = [], selss = [], exhaust = exhaust, nchotomy = Drule.dummy_thm, injects = [inject], - distincts = [], case_thms = [], case_cong = Drule.dummy_thm, case_cong_weak = Drule.dummy_thm, - case_distribs = [], split = Drule.dummy_thm, split_asm = Drule.dummy_thm, disc_defs = [], - disc_thmss = [], discIs = [], disc_eq_cases = [], sel_defs = [], sel_thmss = [sel_thms], - distinct_discsss = [], exhaust_discs = [], exhaust_sels = [], collapses = [], expands = [], - split_sels = [], split_sel_asms = [], case_eq_ifs = []}; - -fun lhs_of_equation \<^Const_>\Pure.eq _ for t _\ = t - | lhs_of_equation 
\<^Const_>\Trueprop for \<^Const_>\HOL.eq _ for t _\\ = t; - -fun add_spec_rule rule = - let val head = head_of (lhs_of_equation (Thm.prop_of rule)) - in Spec_Rules.add_global Binding.empty Spec_Rules.equational [head] [rule] end; - -(* definition *) - -fun definition overloaded (alphas, binding) parent (parents: parent_info list) raw_fields thy0 = - let - val ctxt0 = Proof_Context.init_global thy0; - - val prefix = Binding.name_of binding; - val name = Sign.full_name thy0 binding; - val full = Sign.full_name_path thy0 prefix; - - val bfields = map (fn (x, T, _) => (x, T)) raw_fields; - val field_syntax = map #3 raw_fields; - - val parent_fields = maps #fields parents; - val parent_chunks = map (length o #fields) parents; - val parent_names = map fst parent_fields; - val parent_types = map snd parent_fields; - val parent_fields_len = length parent_fields; - val parent_variants = - Name.variant_list [moreN, rN, rN ^ "'", wN] (map Long_Name.base_name parent_names); - val parent_vars = map2 (curry Free) parent_variants parent_types; - val parent_len = length parents; - - val fields = map (apfst full) bfields; - val names = map fst fields; - val types = map snd fields; - val alphas_fields = fold Term.add_tfreesT types []; - val alphas_ext = inter (op =) alphas_fields alphas; - val len = length fields; - val variants = - Name.variant_list (moreN :: rN :: (rN ^ "'") :: wN :: parent_variants) - (map (Binding.name_of o fst) bfields); - val vars = map2 (curry Free) variants types; - val named_vars = names ~~ vars; - val idxms = 0 upto len; - - val all_fields = parent_fields @ fields; - val all_types = parent_types @ types; - val all_variants = parent_variants @ variants; - val all_vars = parent_vars @ vars; - val all_named_vars = (parent_names ~~ parent_vars) @ named_vars; - - val zeta = (singleton (Name.variant_list (map #1 alphas)) "'z", \<^sort>\type\); - val moreT = TFree zeta; - val more = Free (moreN, moreT); - val full_moreN = full (Binding.name moreN); - val 
bfields_more = bfields @ [(Binding.name moreN, moreT)]; - val fields_more = fields @ [(full_moreN, moreT)]; - val named_vars_more = named_vars @ [(full_moreN, more)]; - val all_vars_more = all_vars @ [more]; - val all_named_vars_more = all_named_vars @ [(full_moreN, more)]; - - - (* 1st stage: ext_thy *) - - val extension_name = full binding; - - val ((ext, (ext_tyco, vs), - extT, ext_induct, ext_inject, ext_surjective, ext_split, ext_def), ext_thy) = - thy0 - |> Sign.qualified_path false binding - |> extension_definition overloaded extension_name fields alphas_ext zeta moreT more vars; - val ext_ctxt = Proof_Context.init_global ext_thy; - - val _ = timing_msg ext_ctxt "record preparing definitions"; - val Type extension_scheme = extT; - val extension_name = unsuffix ext_typeN (fst extension_scheme); - val extension = let val (n, Ts) = extension_scheme in (n, subst_last HOLogic.unitT Ts) end; - val extension_names = map (unsuffix ext_typeN o fst o #extension) parents @ [extension_name]; - val extension_id = implode extension_names; - - fun rec_schemeT n = mk_recordT (map #extension (drop n parents)) extT; - val rec_schemeT0 = rec_schemeT 0; - - fun recT n = - let val (c, Ts) = extension in - mk_recordT (map #extension (drop n parents)) - (Type (c, subst_last HOLogic.unitT Ts)) - end; - val recT0 = recT 0; - - fun mk_rec args n = - let - val (args', more) = chop_last args; - fun mk_ext' ((name, T), args) more = mk_ext (name, T) (args @ [more]); - fun build Ts = - fold_rev mk_ext' (drop n ((extension_names ~~ Ts) ~~ chunks parent_chunks args')) more; - in - if more = HOLogic.unit - then build (map_range recT (parent_len + 1)) - else build (map_range rec_schemeT (parent_len + 1)) - end; - - val r_rec0 = mk_rec all_vars_more 0; - val r_rec_unit0 = mk_rec (all_vars @ [HOLogic.unit]) 0; - - fun r n = Free (rN, rec_schemeT n); - val r0 = r 0; - fun r_unit n = Free (rN, recT n); - val r_unit0 = r_unit 0; - - - (* print translations *) - - val record_ext_type_abbr_tr's = - 
let - val trname = hd extension_names; - val last_ext = unsuffix ext_typeN (fst extension); - in [record_ext_type_abbr_tr' name alphas zeta last_ext rec_schemeT0 trname] end; - - val record_ext_type_tr's = - let - (*avoid conflict with record_type_abbr_tr's*) - val trnames = if parent_len > 0 then [extension_name] else []; - in map record_ext_type_tr' trnames end; - - val print_translation = - map field_update_tr' (full_moreN :: names) @ [record_ext_tr' extension_name] @ - record_ext_type_tr's @ record_ext_type_abbr_tr's; - - - (* prepare declarations *) - - val sel_decls = map (mk_selC rec_schemeT0 o apfst Binding.name_of) bfields_more; - val upd_decls = map (mk_updC updateN rec_schemeT0 o apfst Binding.name_of) bfields_more; - val make_decl = (makeN, all_types ---> recT0); - val fields_decl = (fields_selN, types ---> Type extension); - val extend_decl = (extendN, recT0 --> moreT --> rec_schemeT0); - val truncate_decl = (truncateN, rec_schemeT0 --> recT0); - - - (* prepare definitions *) - - val ext_defs = ext_def :: map #ext_def parents; - - (*Theorems from the iso_tuple intros. - By unfolding ext_defs from r_rec0 we create a tree of constructor - calls (many of them Pair, but others as well). The introduction - rules for update_accessor_eq_assist can unify two different ways - on these constructors. 
If we take the complete result sequence of - running the introduction tactic, we get one theorem for each upd/acc - pair, from which we can derive the bodies of our selector and - updator and their convs.*) - val (accessor_thms, updator_thms, upd_acc_cong_assists) = - timeit_msg ext_ctxt "record getting tree access/updates:" (fn () => - let - val r_rec0_Vars = - let - (*pick variable indices of 1 to avoid possible variable - collisions with existing variables in updacc_eq_triv*) - fun to_Var (Free (c, T)) = Var ((c, 1), T); - in mk_rec (map to_Var all_vars_more) 0 end; - - val init_thm = - infer_instantiate ext_ctxt - [(("v", 0), Thm.cterm_of ext_ctxt r_rec0), - (("v'", 0), Thm.cterm_of ext_ctxt r_rec0_Vars)] - updacc_eq_triv; - val terminal = - resolve_tac ext_ctxt [updacc_eq_idI] 1 THEN resolve_tac ext_ctxt [refl] 1; - val tactic = - simp_tac (put_simpset HOL_basic_ss ext_ctxt addsimps ext_defs) 1 THEN - REPEAT (Iso_Tuple_Support.iso_tuple_intros_tac ext_ctxt 1 ORELSE terminal); - val updaccs = Seq.list_of (tactic init_thm); - in - (updaccs RL [updacc_accessor_eqE], - updaccs RL [updacc_updator_eqE], - updaccs RL [updacc_cong_from_eq]) - end); - - fun lastN xs = drop parent_fields_len xs; - - (*selectors*) - fun mk_sel_spec ((c, T), thm) = - let - val (acc $ arg, _) = - HOLogic.dest_eq (HOLogic.dest_Trueprop (Envir.beta_eta_contract (Thm.concl_of thm))); - val _ = - if arg aconv r_rec0 then () - else raise TERM ("mk_sel_spec: different arg", [arg]); - in - Const (mk_selC rec_schemeT0 (c, T)) :== acc - end; - val sel_specs = map mk_sel_spec (fields_more ~~ lastN accessor_thms); - - (*updates*) - fun mk_upd_spec ((c, T), thm) = - let - val (upd $ _ $ arg, _) = - HOLogic.dest_eq (HOLogic.dest_Trueprop (Envir.beta_eta_contract (Thm.concl_of thm))); - val _ = - if arg aconv r_rec0 then () - else raise TERM ("mk_upd_spec: different arg", [arg]); - in Const (mk_updC updateN rec_schemeT0 (c, T)) :== upd end; - val upd_specs = map mk_upd_spec (fields_more ~~ lastN 
updator_thms); - - (*derived operations*) - val make_spec = - list_comb (Const (full (Binding.name makeN), all_types ---> recT0), all_vars) :== - mk_rec (all_vars @ [HOLogic.unit]) 0; - val fields_spec = - list_comb (Const (full (Binding.name fields_selN), types ---> Type extension), vars) :== - mk_rec (all_vars @ [HOLogic.unit]) parent_len; - val extend_spec = - Const (full (Binding.name extendN), recT0 --> moreT --> rec_schemeT0) $ r_unit0 $ more :== - mk_rec ((map (mk_sel r_unit0) all_fields) @ [more]) 0; - val truncate_spec = - Const (full (Binding.name truncateN), rec_schemeT0 --> recT0) $ r0 :== - mk_rec ((map (mk_sel r0) all_fields) @ [HOLogic.unit]) 0; - - - (* 2nd stage: defs_thy *) - - val (((sel_defs, upd_defs), derived_defs), defs_thy) = - timeit_msg ext_ctxt "record trfuns/tyabbrs/selectors/updates/make/fields/extend/truncate defs:" - (fn () => - ext_thy - |> Sign.print_translation print_translation - |> Sign.restore_naming thy0 - |> Typedecl.abbrev_global (binding, map #1 alphas, NoSyn) recT0 |> snd - |> Typedecl.abbrev_global - (Binding.suffix_name schemeN binding, map #1 (alphas @ [zeta]), NoSyn) rec_schemeT0 - |> snd - |> Sign.qualified_path false binding - |> fold (fn ((x, T), mx) => snd o Sign.declare_const_global ((Binding.name x, T), mx)) - (sel_decls ~~ (field_syntax @ [NoSyn])) - |> fold (fn (x, T) => snd o Sign.declare_const_global ((Binding.name x, T), NoSyn)) - (upd_decls @ [make_decl, fields_decl, extend_decl, truncate_decl]) - |> (Global_Theory.add_defs false o - map (Thm.no_attributes o apfst (Binding.concealed o Binding.name))) sel_specs - ||>> (Global_Theory.add_defs false o - map (Thm.no_attributes o apfst (Binding.concealed o Binding.name))) upd_specs - ||>> (Global_Theory.add_defs false o - map (Thm.no_attributes o apfst (Binding.concealed o Binding.name))) - [make_spec, fields_spec, extend_spec, truncate_spec]); - val defs_ctxt = Proof_Context.init_global defs_thy; - - (* prepare propositions *) - val _ = timing_msg defs_ctxt 
"record preparing propositions"; - val P = Free (singleton (Name.variant_list all_variants) "P", rec_schemeT0 --> HOLogic.boolT); - val C = Free (singleton (Name.variant_list all_variants) "C", HOLogic.boolT); - val P_unit = Free (singleton (Name.variant_list all_variants) "P", recT0 --> HOLogic.boolT); - - (*selectors*) - val sel_conv_props = - map (fn (c, x as Free (_, T)) => mk_sel r_rec0 (c, T) === x) named_vars_more; - - (*updates*) - fun mk_upd_prop i (c, T) = - let - val x' = - Free (singleton (Name.variant_list all_variants) (Long_Name.base_name c ^ "'"), T --> T); - val n = parent_fields_len + i; - val args' = nth_map n (K (x' $ nth all_vars_more n)) all_vars_more; - in mk_upd updateN c x' r_rec0 === mk_rec args' 0 end; - val upd_conv_props = map2 mk_upd_prop idxms fields_more; - - (*induct*) - val induct_scheme_prop = - fold_rev Logic.all all_vars_more (Trueprop (P $ r_rec0)) ==> Trueprop (P $ r0); - val induct_prop = - (fold_rev Logic.all all_vars (Trueprop (P_unit $ r_rec_unit0)), Trueprop (P_unit $ r_unit0)); - - (*surjective*) - val surjective_prop = - let val args = map (fn (c, Free (_, T)) => mk_sel r0 (c, T)) all_named_vars_more - in r0 === mk_rec args 0 end; - - (*cases*) - val cases_scheme_prop = - (fold_rev Logic.all all_vars_more ((r0 === r_rec0) ==> Trueprop C), Trueprop C); - - val cases_prop = - fold_rev Logic.all all_vars ((r_unit0 === r_rec_unit0) ==> Trueprop C) ==> Trueprop C; - - (*split*) - val split_meta_prop = - let - val P = Free (singleton (Name.variant_list all_variants) "P", rec_schemeT0 --> propT); - in - Logic.mk_equals (Logic.all r0 (P $ r0), fold_rev Logic.all all_vars_more (P $ r_rec0)) - end; - - val split_object_prop = - let val ALL = fold_rev (fn (v, T) => fn t => HOLogic.mk_all (v, T, t)) - in ALL [dest_Free r0] (P $ r0) === ALL (map dest_Free all_vars_more) (P $ r_rec0) end; - - val split_ex_prop = - let val EX = fold_rev (fn (v, T) => fn t => HOLogic.mk_exists (v, T, t)) - in EX [dest_Free r0] (P $ r0) === EX (map 
dest_Free all_vars_more) (P $ r_rec0) end; - - (*equality*) - val equality_prop = - let - val s' = Free (rN ^ "'", rec_schemeT0); - fun mk_sel_eq (c, Free (_, T)) = mk_sel r0 (c, T) === mk_sel s' (c, T); - val seleqs = map mk_sel_eq all_named_vars_more; - in Logic.all r0 (Logic.all s' (Logic.list_implies (seleqs, r0 === s'))) end; - - - (* 3rd stage: thms_thy *) - - val record_ss = get_simpset defs_thy; - val sel_upd_ss = - simpset_of - (put_simpset record_ss defs_ctxt - addsimps (sel_defs @ accessor_thms @ upd_defs @ updator_thms)); - - val (sel_convs, upd_convs) = - timeit_msg defs_ctxt "record sel_convs/upd_convs proof:" (fn () => - grouped 10 Par_List.map (fn prop => - Goal.prove_sorry_global defs_thy [] [] prop - (fn {context = ctxt', ...} => - ALLGOALS (asm_full_simp_tac (put_simpset sel_upd_ss ctxt')))) - (sel_conv_props @ upd_conv_props)) - |> chop (length sel_conv_props); - - val (fold_congs, unfold_congs) = timeit_msg defs_ctxt "record upd fold/unfold congs:" (fn () => - let - val symdefs = map Thm.symmetric (sel_defs @ upd_defs); - val fold_ctxt = put_simpset HOL_basic_ss defs_ctxt addsimps symdefs; - val ua_congs = map (Drule.export_without_context o simplify fold_ctxt) upd_acc_cong_assists; - in (ua_congs RL [updacc_foldE], ua_congs RL [updacc_unfoldE]) end); - - val parent_induct = Option.map #induct_scheme (try List.last parents); - - val induct_scheme = timeit_msg defs_ctxt "record induct_scheme proof:" (fn () => - Goal.prove_sorry_global defs_thy [] [] induct_scheme_prop - (fn {context = ctxt', ...} => - EVERY - [case parent_induct of NONE => all_tac | SOME ind => try_param_tac ctxt' rN ind 1, - try_param_tac ctxt' rN ext_induct 1, - asm_simp_tac (put_simpset HOL_basic_ss ctxt') 1])); - - val induct = timeit_msg defs_ctxt "record induct proof:" (fn () => - Goal.prove_sorry_global defs_thy [] [#1 induct_prop] (#2 induct_prop) - (fn {context = ctxt', prems, ...} => - try_param_tac ctxt' rN induct_scheme 1 - THEN try_param_tac ctxt' "more" @{thm 
unit.induct} 1 - THEN resolve_tac ctxt' prems 1)); - - val surjective = timeit_msg defs_ctxt "record surjective proof:" (fn () => - Goal.prove_sorry_global defs_thy [] [] surjective_prop - (fn {context = ctxt', ...} => - EVERY - [resolve_tac ctxt' [surject_assist_idE] 1, - simp_tac (put_simpset HOL_basic_ss ctxt' addsimps ext_defs) 1, - REPEAT - (Iso_Tuple_Support.iso_tuple_intros_tac ctxt' 1 ORELSE - (resolve_tac ctxt' [surject_assistI] 1 THEN - simp_tac (put_simpset (get_sel_upd_defs defs_thy) ctxt' - addsimps (sel_defs @ @{thms o_assoc id_apply id_o o_id})) 1))])); - - val cases_scheme = timeit_msg defs_ctxt "record cases_scheme proof:" (fn () => - Goal.prove_sorry_global defs_thy [] [#1 cases_scheme_prop] (#2 cases_scheme_prop) - (fn {context = ctxt', prems, ...} => - resolve_tac ctxt' prems 1 THEN - resolve_tac ctxt' [surjective] 1)); - - val cases = timeit_msg defs_ctxt "record cases proof:" (fn () => - Goal.prove_sorry_global defs_thy [] [] cases_prop - (fn {context = ctxt', ...} => - try_param_tac ctxt' rN cases_scheme 1 THEN - ALLGOALS (asm_full_simp_tac - (put_simpset HOL_basic_ss ctxt' addsimps @{thms unit_all_eq1})))); - - val split_meta = timeit_msg defs_ctxt "record split_meta proof:" (fn () => - Goal.prove_sorry_global defs_thy [] [] split_meta_prop - (fn {context = ctxt', ...} => - EVERY1 - [resolve_tac ctxt' @{thms equal_intr_rule}, Goal.norm_hhf_tac ctxt', - eresolve_tac ctxt' @{thms meta_allE}, assume_tac ctxt', - resolve_tac ctxt' [@{thm prop_subst} OF [surjective]], - REPEAT o eresolve_tac ctxt' @{thms meta_allE}, assume_tac ctxt'])); - - val split_object = timeit_msg defs_ctxt "record split_object proof:" (fn () => - Goal.prove_sorry_global defs_thy [] [] split_object_prop - (fn {context = ctxt', ...} => - resolve_tac ctxt' [@{lemma "Trueprop A \ Trueprop B \ A = B" by (rule iffI) unfold}] 1 THEN - rewrite_goals_tac ctxt' @{thms atomize_all [symmetric]} THEN - resolve_tac ctxt' [split_meta] 1)); - - val split_ex = timeit_msg defs_ctxt "record 
split_ex proof:" (fn () => - Goal.prove_sorry_global defs_thy [] [] split_ex_prop - (fn {context = ctxt', ...} => - simp_tac - (put_simpset HOL_basic_ss ctxt' addsimps - (@{lemma "\x. P x \ \ (\x. \ P x)" by simp} :: - @{thms not_not Not_eq_iff})) 1 THEN - resolve_tac ctxt' [split_object] 1)); - - val equality = timeit_msg defs_ctxt "record equality proof:" (fn () => - Goal.prove_sorry_global defs_thy [] [] equality_prop - (fn {context = ctxt', ...} => - asm_full_simp_tac (put_simpset record_ss ctxt' addsimps (split_meta :: sel_convs)) 1)); - - val ([(_, sel_convs'), (_, upd_convs'), (_, sel_defs'), (_, upd_defs'), - (_, fold_congs'), (_, unfold_congs'), - (_, splits' as [split_meta', split_object', split_ex']), (_, derived_defs'), - (_, [surjective']), (_, [equality']), (_, [induct_scheme']), (_, [induct']), - (_, [cases_scheme']), (_, [cases'])], thms_thy) = defs_thy - |> Code.declare_default_eqns_global (map (rpair true) derived_defs) - |> Global_Theory.note_thmss "" - [((Binding.name "select_convs", []), [(sel_convs, [])]), - ((Binding.name "update_convs", []), [(upd_convs, [])]), - ((Binding.name "select_defs", []), [(sel_defs, [])]), - ((Binding.name "update_defs", []), [(upd_defs, [])]), - ((Binding.name "fold_congs", []), [(fold_congs, [])]), - ((Binding.name "unfold_congs", []), [(unfold_congs, [])]), - ((Binding.name "splits", []), [([split_meta, split_object, split_ex], [])]), - ((Binding.name "defs", []), [(derived_defs, [])]), - ((Binding.name "surjective", []), [([surjective], [])]), - ((Binding.name "equality", []), [([equality], [])]), - ((Binding.name "induct_scheme", induct_type_global (suffix schemeN name)), - [([induct_scheme], [])]), - ((Binding.name "induct", induct_type_global name), [([induct], [])]), - ((Binding.name "cases_scheme", cases_type_global (suffix schemeN name)), - [([cases_scheme], [])]), - ((Binding.name "cases", cases_type_global name), [([cases], [])])]; - - val sel_upd_simps = sel_convs' @ upd_convs'; - val sel_upd_defs = 
sel_defs' @ upd_defs'; - val depth = parent_len + 1; - - val ([(_, simps'), (_, iffs')], thms_thy') = thms_thy - |> Global_Theory.note_thmss "" - [((Binding.name "simps", [Simplifier.simp_add]), [(sel_upd_simps, [])]), - ((Binding.name "iffs", [iff_add]), [([ext_inject], [])])]; - - val info = - make_info alphas parent fields extension - ext_induct ext_inject ext_surjective ext_split ext_def - sel_convs' upd_convs' sel_defs' upd_defs' fold_congs' unfold_congs' splits' derived_defs' - surjective' equality' induct_scheme' induct' cases_scheme' cases' simps' iffs'; - - val final_thy = - thms_thy' - |> put_record name info - |> put_sel_upd names full_moreN depth sel_upd_simps sel_upd_defs - |> add_equalities extension_id equality' - |> add_extinjects ext_inject - |> add_extsplit extension_name ext_split - |> add_splits extension_id (split_meta', split_object', split_ex', induct_scheme') - |> add_extfields extension_name (fields @ [(full_moreN, moreT)]) - |> add_fieldext (extension_name, snd extension) names - |> add_code ext_tyco vs extT ext simps' ext_inject - |> add_ctr_sugar (Const ext) cases_scheme' ext_inject sel_convs' - |> fold add_spec_rule (sel_convs' @ upd_convs' @ derived_defs') - |> Sign.restore_naming thy0; - - in final_thy end; - - -(* add_record *) - -local - -fun read_parent NONE ctxt = (NONE, ctxt) - | read_parent (SOME raw_T) ctxt = - (case Proof_Context.read_typ_abbrev ctxt raw_T of - Type (name, Ts) => (SOME (Ts, name), fold Variable.declare_typ Ts ctxt) - | T => error ("Bad parent record specification: " ^ Syntax.string_of_typ ctxt T)); - -fun read_fields raw_fields ctxt = - let - val Ts = Syntax.read_typs ctxt (map (fn (_, raw_T, _) => raw_T) raw_fields); - val fields = map2 (fn (x, _, mx) => fn T => (x, T, mx)) raw_fields Ts; - val ctxt' = fold Variable.declare_typ Ts ctxt; - in (fields, ctxt') end; - -in - -fun add_record overloaded (params, binding) raw_parent raw_fields thy = - let - val ctxt = Proof_Context.init_global thy; - fun cert_typ T = 
Type.no_tvars (Proof_Context.cert_typ ctxt T) - handle TYPE (msg, _, _) => error msg; - - - (* specification *) - - val parent = Option.map (apfst (map cert_typ)) raw_parent - handle ERROR msg => - cat_error msg ("The error(s) above occurred in parent record specification"); - val parent_args = (case parent of SOME (Ts, _) => Ts | NONE => []); - val parents = get_parent_info thy parent; - - val bfields = - raw_fields |> map (fn (x, raw_T, mx) => (x, cert_typ raw_T, mx) - handle ERROR msg => - cat_error msg ("The error(s) above occurred in record field " ^ Binding.print x)); - - - (* errors *) - - val name = Sign.full_name thy binding; - val err_dup_record = - if is_none (get_info thy name) then [] - else ["Duplicate definition of record " ^ quote name]; - - val spec_frees = fold Term.add_tfreesT (parent_args @ map #2 bfields) []; - val err_extra_frees = - (case subtract (op =) params spec_frees of - [] => [] - | extras => ["Extra free type variable(s) " ^ - commas (map (Syntax.string_of_typ ctxt o TFree) extras)]); - - val err_no_fields = if null bfields then ["No fields present"] else []; - - val err_dup_fields = - (case duplicates Binding.eq_name (map #1 bfields) of - [] => [] - | dups => ["Duplicate field(s) " ^ commas (map Binding.print dups)]); - - val err_bad_fields = - if forall (not_equal moreN o Binding.name_of o #1) bfields then [] - else ["Illegal field name " ^ quote moreN]; - - val errs = - err_dup_record @ err_extra_frees @ err_no_fields @ err_dup_fields @ err_bad_fields; - val _ = if null errs then () else error (cat_lines errs); - in - thy |> definition overloaded (params, binding) parent parents bfields - end - handle ERROR msg => cat_error msg ("Failed to define record " ^ Binding.print binding); - -fun add_record_cmd overloaded (raw_params, binding) raw_parent raw_fields thy = - let - val ctxt = Proof_Context.init_global thy; - val params = map (apsnd (Typedecl.read_constraint ctxt)) raw_params; - val ctxt1 = fold (Variable.declare_typ o TFree) 
params ctxt; - val (parent, ctxt2) = read_parent raw_parent ctxt1; - val (fields, ctxt3) = read_fields raw_fields ctxt2; - val params' = map (Proof_Context.check_tfree ctxt3) params; - in thy |> add_record overloaded (params', binding) parent fields end; - -end; - - -(* printing *) - -local - -fun the_parent_recT (Type (parent, [Type (_, [unit as Type (_,[])])])) = Type (parent, [unit]) - | the_parent_recT (Type (extT, [T])) = Type (extT, [the_parent_recT T]) - | the_parent_recT T = raise TYPE ("Not a unit record scheme with parent: ", [T], []) - -in - -fun pretty_recT ctxt typ = - let - val thy = Proof_Context.theory_of ctxt - val (fs, (_, moreT)) = get_recT_fields thy typ - val _ = if moreT = HOLogic.unitT then () else raise TYPE ("Not a unit record scheme: ", [typ], []) - val parent = if length (dest_recTs typ) >= 2 then SOME (the_parent_recT typ) else NONE - val pfs = case parent of SOME p => fst (get_recT_fields thy p) | NONE => [] - val fs' = drop (length pfs) fs - fun pretty_field (name, typ) = Pretty.block [ - Syntax.pretty_term ctxt (Const (name, typ)), - Pretty.brk 1, - Pretty.str "::", - Pretty.brk 1, - Syntax.pretty_typ ctxt typ - ] - in - Pretty.block (Library.separate (Pretty.brk 1) - ([Pretty.keyword1 "record", Syntax.pretty_typ ctxt typ, Pretty.str "="] @ - (case parent of - SOME p => [Syntax.pretty_typ ctxt p, Pretty.str "+"] - | NONE => [])) @ - Pretty.fbrk :: - Pretty.fbreaks (map pretty_field fs')) - end - -end - -fun string_of_record ctxt s = - let - val T = Syntax.read_typ ctxt s - in - Pretty.string_of (pretty_recT ctxt T) - handle TYPE _ => error ("Unknown record: " ^ Syntax.string_of_typ ctxt T) - end - -val print_record = - let - fun print_item string_of (modes, arg) = Toplevel.keep (fn state => - Print_Mode.with_modes modes (fn () => Output.writeln (string_of state arg)) ()); - in print_item (string_of_record o Toplevel.context_of) end - - -(* outer syntax *) - -val _ = - Outer_Syntax.command \<^command_keyword>\record*\ "define 
extensible record" - (Parse_Spec.overloaded -- (Parse.type_args_constrained -- Parse.binding) -- - (\<^keyword>\=\ |-- Scan.option (Parse.typ --| \<^keyword>\+\) -- - Scan.repeat1 Parse.const_binding) - >> (fn ((overloaded, x), (y, z)) => - Toplevel.theory (add_record_cmd {overloaded = overloaded} x y z))); - -val opt_modes = - Scan.optional (\<^keyword>\(\ |-- Parse.!!! (Scan.repeat1 Parse.name --| \<^keyword>\)\)) [] - -val _ = - Outer_Syntax.command \<^command_keyword>\print_record*\ "print record definition" - (opt_modes -- Parse.typ >> print_record); - -end diff --git a/Isabelle_DOF/ontologies/scholarly_paper/scholarly_paper.thy b/Isabelle_DOF/ontologies/scholarly_paper/scholarly_paper.thy index 732200f..c62d28b 100644 --- a/Isabelle_DOF/ontologies/scholarly_paper/scholarly_paper.thy +++ b/Isabelle_DOF/ontologies/scholarly_paper/scholarly_paper.thy @@ -165,11 +165,11 @@ text\The intended use for the \doc_class\es \<^verbatim>\\math_example\ (or \<^verbatim>\math_ex\ for short) is \<^emph>\informal\ descriptions of semi-formal definitions (by inheritance). 
Math-Examples can be made referentiable triggering explicit, numbered presentations.\ -doc_class math_motivation = tc + +doc_class math_motivation = technical + referentiable :: bool <= False type_synonym math_mtv = math_motivation -doc_class math_explanation = tc + +doc_class math_explanation = technical + referentiable :: bool <= False type_synonym math_exp = math_explanation @@ -207,7 +207,7 @@ datatype math_content_class = text\Instances of the \doc_class\ \<^verbatim>\math_content\ are by definition @{term "semiformal"}; they may be non-referential, but in this case they will not have a @{term "short_name"}.\ -doc_class math_content = tc + +doc_class math_content = technical + referentiable :: bool <= False short_name :: string <= "''''" status :: status <= "semiformal" @@ -516,34 +516,34 @@ subsection\Content in Engineering/Tech Papers \ text\This section is currently experimental and not supported by the documentation generation backend.\ -doc_class engineering_content = tc + +doc_class engineering_content = technical + short_name :: string <= "''''" status :: status type_synonym eng_content = engineering_content -doc_class "experiment" = eng_content + +doc_class "experiment" = engineering_content + tag :: "string" <= "''''" -doc_class "evaluation" = eng_content + +doc_class "evaluation" = engineering_content + tag :: "string" <= "''''" -doc_class "data" = eng_content + +doc_class "data" = engineering_content + tag :: "string" <= "''''" -doc_class tech_definition = eng_content + +doc_class tech_definition = engineering_content + referentiable :: bool <= True tag :: "string" <= "''''" -doc_class tech_code = eng_content + +doc_class tech_code = engineering_content + referentiable :: bool <= True tag :: "string" <= "''''" -doc_class tech_example = eng_content + +doc_class tech_example = engineering_content + referentiable :: bool <= True tag :: "string" <= "''''" -doc_class eng_example = eng_content + +doc_class eng_example = engineering_content + referentiable 
     :: bool <= True
     tag :: "string" <= "''''"
 
diff --git a/Isabelle_DOF/thys/Isa_COL.thy b/Isabelle_DOF/thys/Isa_COL.thy
index 6d4aed8..a5e007f 100644
--- a/Isabelle_DOF/thys/Isa_COL.thy
+++ b/Isabelle_DOF/thys/Isa_COL.thy
@@ -376,7 +376,9 @@ fun fig_content_antiquotation name scan =
 
 fun figure_content ctxt (cfg_trans,file:Input.source) =
-  let val (wdth_val_s, ht_s, caption) = process_args cfg_trans
+  let val _ = Resources.check_file ctxt (SOME (get_document_dir ctxt)) file
+      (* ToDo: must be declared source of type png or jpeg or pdf, ... *)
+      val (wdth_val_s, ht_s, caption) = process_args cfg_trans
       val args = ["keepaspectratio","width=" ^ wdth_val_s, ht_s]
                  |> commas |> enclose "[" "]"
@@ -427,11 +429,11 @@ fun convert_src_from_margs ctxt (X, (((str,_),value)::R)) =
 
 fun float_command (name, pos) descr cid =
   let fun set_default_class NONE = SOME(cid,pos)
        |set_default_class (SOME X) = SOME X
-      fun create_instance ((((oid, pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
+      fun create_instance (((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
        Value_Command.Docitem_Parser.create_and_check_docitem
          {is_monitor = false} {is_inline = true}
-         {define = true} oid pos (set_default_class cid_pos) doc_attrs
+         {define = true} binding (set_default_class cid_pos) doc_attrs
      fun generate_fig_ltx_ctxt ctxt cap_src oid body =
        Latex.macro0 "centering"
        @ body
@@ -439,25 +441,31 @@ fun float_command (name, pos) descr cid =
        @ Latex.macro "label" (DOF_core.get_instance_name_global oid (Proof_Context.theory_of ctxt)
                               |> DOF_core.output_name |> Latex.string)
-     fun parse_and_tex (margs as (((oid, _),_), _), cap_src) ctxt =
-       (convert_src_from_margs ctxt margs)
-       |> pair (upd_caption (K Input.empty) #> convert_meta_args ctxt margs)
-       |> fig_content ctxt
-       |> generate_fig_ltx_ctxt ctxt cap_src oid
-       |> (Latex.environment ("figure") )
+     fun parse_and_tex (margs as ((binding,_), _), cap_src) ctxt =
+       let val oid = Binding.name_of binding
+       in
+         (convert_src_from_margs ctxt margs)
+         |> pair (upd_caption (K Input.empty) #> convert_meta_args ctxt margs)
+         |> fig_content ctxt
+         |> generate_fig_ltx_ctxt ctxt cap_src oid
+         |> (Latex.environment ("figure") )
+       end
  in  Monitor_Command_Parser.onto_macro_cmd_command (name, pos) descr create_instance parse_and_tex
  end
 
 fun listing_command (name, pos) descr cid =
   let fun set_default_class NONE = SOME(cid,pos)
        |set_default_class (SOME X) = SOME X
-      fun create_instance ((((oid, pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
+      fun create_instance (((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
        Value_Command.Docitem_Parser.create_and_check_docitem
          {is_monitor = false} {is_inline = true}
-         {define = true} oid pos (set_default_class cid_pos) doc_attrs
-      fun parse_and_tex (margs as (((_, pos),_), _), _) _ =
-        ISA_core.err ("Not yet implemented.\n Please use text*[oid::listing]\\ instead.") pos
+         {define = true} binding (set_default_class cid_pos) doc_attrs
+      fun parse_and_tex (margs as ((binding,_), _), _) _ =
+        let val pos = Binding.pos_of binding
+        in
+          ISA_core.err ("Not yet implemented.\n Please use text*[oid::listing]\\ instead.") pos
+        end
  in  Monitor_Command_Parser.onto_macro_cmd_command (name, pos) descr create_instance parse_and_tex
  end
@@ -752,7 +760,7 @@ text\ @{table_inline [display] (cell_placing = center,cell_height =\
 *)
 
-text\beamer frame environment\
+text\beamer support\
 
 (* Under development *)
 doc_class frame =
@@ -784,6 +792,18 @@ fun upd_frametitle f =
 fun upd_framesubtitle f =
   upd_frame (fn (options, frametitle, framesubtitle) =>
     (options, frametitle, f framesubtitle))
 
+type block = {title: Input.source}
+
+val empty_block = {title = Input.empty}
+
+fun make_block title = {title = title}
+
+fun upd_block f =
+  fn {title} => make_block (f title)
+
+fun upd_block_title f =
+  upd_block (fn title => f title)
+
 val unenclose_end = unenclose
 
 val unenclose_string = unenclose o unenclose o unenclose_end
@@ -794,6 +814,42 @@ fun read_string s =
   else unenclose_string s |> Syntax.read_input
 end
 
+val block_titleN = "title"
+
+fun block_modes (ctxt, toks) =
+  let val (y, toks') = ((((Scan.optional
+                            (Args.parens
+                              (Parse.list1
+                                ((Args.$$$ block_titleN |-- Args.$$$ "=" -- Parse.document_source
+                                  >> (fn (_, k) => upd_block_title (K k)))
+                            ))) [K empty_block])
+                          : (block -> block) list parser)
+                        >> (foldl1 (op #>)))
+                       : (block -> block) parser) (toks)
+  in (y, (ctxt, toks')) end
+
+fun process_args cfg_trans =
+  let val {title} = cfg_trans empty_block
+  in title end
+
+fun block ctxt (cfg_trans,src) =
+  let val title = process_args cfg_trans
+  in Latex.string "{"
+     @ (title |> Document_Output.output_document ctxt {markdown = false})
+     @ Latex.string "}"
+     @ (src |> Document_Output.output_document ctxt {markdown = false})
+     |> (Latex.environment "block")
+  end
+
+fun block_antiquotation name scan =
+  (Document_Output.antiquotation_raw_embedded name
+    (scan : ((block -> block) * Input.source) context_parser)
+    (block: Proof.context -> (block -> block) * Input.source -> Latex.text));
+
+val _ = block_antiquotation \<^binding>\block\ (block_modes -- Scan.lift Parse.document_source)
+        |> Theory.setup
+
 fun convert_meta_args ctxt (X, (((str,_),value) :: R)) =
   (case YXML.content_of str of
     "frametitle" => upd_frametitle (K(YXML.content_of value |> read_string))
@@ -808,18 +864,19 @@ fun convert_meta_args ctxt (X, (((str,_),value) :: R)) =
 
 fun frame_command (name, pos) descr cid =
   let fun set_default_class NONE = SOME(cid,pos)
        |set_default_class (SOME X) = SOME X
-      fun create_instance ((((oid, pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
+      fun create_instance (((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
        Value_Command.Docitem_Parser.create_and_check_docitem
          {is_monitor = false} {is_inline = true}
-         {define = true} oid pos (set_default_class cid_pos) doc_attrs
-      fun titles_src ctxt frametitle framesubtitle src = Latex.string "{"
-        @ Document_Output.output_document ctxt {markdown = false} frametitle
-        @ Latex.string "}"
-        @ Latex.string "{"
-        @ (Document_Output.output_document ctxt {markdown = false} framesubtitle)
-        @ Latex.string "}"
-        @ Document_Output.output_document ctxt {markdown = true} src
+         {define = true} binding (set_default_class cid_pos) doc_attrs
+      fun titles_src ctxt frametitle framesubtitle src =
+        Latex.string "{"
+        @ Document_Output.output_document ctxt {markdown = false} frametitle
+        @ Latex.string "}"
+        @ Latex.string "{"
+        @ (Document_Output.output_document ctxt {markdown = false} framesubtitle)
+        @ Latex.string "}"
+        @ Document_Output.output_document ctxt {markdown = true} src
      fun generate_src_ltx_ctxt ctxt src cfg_trans =
        let val {options, frametitle, framesubtitle} = cfg_trans empty_frame
        in
diff --git a/Isabelle_DOF/thys/Isa_DOF.thy b/Isabelle_DOF/thys/Isa_DOF.thy
index d2c0a59..699726e 100644
--- a/Isabelle_DOF/thys/Isa_DOF.thy
+++ b/Isabelle_DOF/thys/Isa_DOF.thy
@@ -93,7 +93,7 @@ structure DOF_core =
 struct
 
   datatype onto_class = Onto_Class of
-    {params : (string * sort) list, (*currently not used *)
+    {params : (string * sort) list,
     name : binding,
     virtual : {virtual : bool},
     inherits_from : (typ list * string) option, (* imports *)
@@ -115,33 +115,69 @@ struct
 
  val get_onto_classes = Onto_Classes.get o Proof_Context.theory_of
 
+  (* Get the Onto_Class object from its short name or long name *)
  fun get_onto_class_global name thy =
    Name_Space.check (Context.Theory thy)
      (get_onto_classes (Proof_Context.init_global thy)) (name, Position.none) |> #2
 
-  fun resolve_syn thy name =
-    name |> Syntax.read_typ_global thy
-         |> Syntax.string_of_typ_global thy
-         |> YXML.parse_body |> XML.content_of
+  fun markup2string s = YXML.content_of s
+                        |> Symbol.explode
+                        |> List.filter (fn c => c <> Symbol.DEL)
+                        |> String.concat
 
-  (* takes class synonyms into account *)
+  fun get_name name =
+    name |> markup2string
+         |> Symbol.explode_words |> split_last |> #2
+
+  (* Get the Onto_Class object.
+     The function does not use the abstract syntax of an onto_class
+     and relies entirely on the record created with the onto_class and its type.
+     The function takes, for the name argument, a string that has the same form as
+     the input of an ML \<^typ>\\ anti-quotation to get a record type.
+     To get the type of the record ('a, 'b) A, we'll use \<^typ>\('a, 'b) A\.
+     So the name argument will be "('a, 'b) A" *)
+  (* does not take class synonyms into account *)
  fun get_onto_class_global' name thy =
-    let val name' = name |> resolve_syn thy
+    let val name' = name |> get_name
    in Name_Space.check (Context.Theory thy)
        (get_onto_classes (Proof_Context.init_global thy)) (name', Position.none) |> #2
    end
 
+  (* Get the fully qualified name of an onto_class from its short name or long name *)
  fun get_onto_class_name_global name thy =
    Name_Space.check (Context.Theory thy)
      (get_onto_classes (Proof_Context.init_global thy)) (name, Position.none) |> #1
 
-  (* takes class synonyms into account *)
+  (* Get the fully qualified name of an onto-class.
+     The function does not use the abstract syntax of an onto_class
+     and relies entirely on the record created with the onto_class and its type.
+     The function takes, for the name argument, a string that has the same form as
+     the input of an ML \<^typ>\\ anti-quotation to get a record type.
+     To get the type of the record ('a, 'b) A, we'll use \<^typ>\('a, 'b) A\.
+     So the name argument will be "('a, 'b) A". *)
+  (* does not take class synonyms into account *)
  fun get_onto_class_name_global' name thy =
-    let val name' = name |> resolve_syn thy
+    let val name' = name |> get_name
    in Name_Space.check (Context.Theory thy)
        (get_onto_classes (Proof_Context.init_global thy)) (name', Position.none) |> #1
    end
 
+  (* Get the fully qualified name of an onto-class,
+     its string representation with arguments and sorts, and its type,
+     from its short or long name.
+     To get the type of the record ('a::one, 'b) A, we'll use \<^typ>\('a::one, 'b) A\.
+     So the name argument will be "('a::one, 'b) A" *)
+  (* does not take class synonyms into account *)
+  fun get_onto_class_cid thy name =
+    let val cid_typ = name |> Syntax.read_typ_global thy
+        val (args, name') = name |> Symbol.explode_words |> split_last
+        val name'' = (name', Position.none)
+                     |> Name_Space.check (Context.Theory thy)
+                          (get_onto_classes (Proof_Context.init_global thy))
+                     |> fst
+        val args_cid = append args [name''] |> space_implode Symbol.space
+    in ((name'', args_cid), cid_typ) end
+
  fun check_onto_class ctxt =
    #1 o Name_Space.check (Context.Proof ctxt) (get_onto_classes ctxt);
@@ -198,7 +234,7 @@ struct
            rejectS , rex, invs) =>
            (params, name, virtual, inherits_from, attribute_decl, rejectS , rex, f invs))
 
-  fun rep_onto_class cid thy = get_onto_class_global' cid thy |> (fn Onto_Class rep => rep)
+  fun rep_onto_class cid thy = get_onto_class_global cid thy |> (fn Onto_Class rep => rep)
 
  fun params_of cid = #params o rep_onto_class cid
  fun name_of cid = #name o rep_onto_class cid
@@ -228,16 +264,18 @@ struct
 
 val default_cid = "text" (* the top (default) document class: everything is a text.*)
 
+(* Due to the implementation, is_subclass0 is _NOT_ compatible with DOF class type synonyms,
+   so they must be resolved to full class names before use. *)
 fun is_subclass0 s t ctxt =
  let fun get id = if id = default_cid then default_cid
-                   else check_onto_class ctxt (id, Position.none)
+                   else get_onto_class_name_global id (Proof_Context.theory_of ctxt)
      val s' = get s
      val t' = get t
      fun father_is_sub s = case get_onto_class_global s (Proof_Context.theory_of ctxt) of
-                             (Onto_Class {inherits_from=NONE, ...}) => s' = t'
-                           | (Onto_Class {inherits_from=SOME (_,s''), ...}) =>
-                               s'' = t' orelse father_is_sub s''
+                             (Onto_Class {inherits_from=NONE, ...}) => s = t'
+                           | (Onto_Class {inherits_from=SOME (_, s''), ...}) =>
+                               s'' = t' orelse father_is_sub s''
      val s'_defined = s' |> Name_Space.declared (get_onto_classes ctxt |> Name_Space.space_of_table)
  in s' = t' orelse
@@ -694,9 +732,7 @@ fun check_reject_atom term =
 fun define_doc_class_global (params', binding) parent fields rexp reject_Atoms invs virtual thy =
 (* This operation is executed in a context where the record has already been defined, but
   its conversion into a class is not yet done. *)
-  let (* takes class synonyms into account *)
-      val parent' = Option.map (apsnd (fn x => get_onto_class_name_global' x thy)) parent
-      val rejectS = map (Syntax.read_term_global thy) reject_Atoms;
+  let val rejectS = map (Syntax.read_term_global thy) reject_Atoms;
      val _ = map (check_reject_atom) rejectS;
      val reg_exps = map (Syntax.read_term_global thy) rexp;
      val _ = map check_regexps reg_exps
@@ -707,15 +743,14 @@ fun define_doc_class_global (params', binding) parent fields rexp reject_Atoms i
              |> (fn pos => error("invariant labels must be unique"^ Position.here pos))
          else ()
      val invs' = map (apsnd(Syntax.read_term_global thy)) invs
-  in thy |> add_onto_class binding (make_onto_class (params', binding, virtual, parent', fields
+  in thy |> add_onto_class binding (make_onto_class (params', binding, virtual, parent, fields
                                                    , rejectS, reg_exps, invs'))
  end
 
-fun define_object_global {define = define} ((oid, pos), instance) thy =
+fun define_object_global {define = define} (binding, instance) thy =
  let
-   val binding = if Long_Name.is_qualified oid
-                 then Binding.make (Long_Name.base_name oid, pos)
-                 else Binding.make (oid, pos)
+   val oid = Binding.name_of binding
+   val pos = Binding.pos_of binding
    val _ = if oid = undefined_instance
           then error (oid ^ ": This name is reserved by the implementation" ^ Position.here pos)
           else ()
@@ -739,17 +774,15 @@ fun define_object_global {define = define} ((oid, pos), instance) thy =
      add_instance binding instance thy
  end
 
-
 fun get_attributes_local cid ctxt =
  if cid = default_cid then []
  else let val cid_long = get_onto_class_name_global cid (Proof_Context.theory_of ctxt)
       in
-        case get_onto_class_global cid (Proof_Context.theory_of ctxt) of
+        case get_onto_class_global cid_long (Proof_Context.theory_of ctxt) of
          Onto_Class {inherits_from=NONE, attribute_decl = X, ...} => [(cid_long,X)]
-        | Onto_Class {inherits_from=SOME(_,father),
-                      attribute_decl = X, ...} =>
-            get_attributes_local father ctxt @ [(cid_long,X)]
+        | Onto_Class {inherits_from=SOME(_, father), attribute_decl = X, ...} =>
+            get_attributes_local father ctxt @ [(cid_long,X)]
       end
 
 fun get_attributes cid thy = get_attributes_local cid (Proof_Context.init_global thy)
@@ -780,8 +813,8 @@ fun get_attribute_info_local (*long*)cid attr ctxt : attributes_info option=
               typ = ty})
  end
 
-fun get_attribute_info (*long*)cid attr thy =
-    get_attribute_info_local cid attr (Proof_Context.init_global thy)
+fun get_attribute_info (*long*)cid attr thy =
+    get_attribute_info_local cid attr (Proof_Context.init_global thy)
 
 fun get_attribute_defaults (* long*)cid thy =
  let val attrS = flat(map snd (get_attributes cid thy))
@@ -870,6 +903,25 @@ fun transduce_term_global {mk_elaboration=mk_elaboration} (term,pos) thy =
          | T t = t
  in T term end
 
+(* Elaborate an Isabelle_DOF term-antiquotation from a term that is only parsed (not checked) *)
+fun parsed_elaborate ctxt pos (Const(s,ty) $ t) =
+      if is_ISA s
+      then Syntax.check_term ctxt (Const(s, ty) $ t)
+           |> (fn t => transduce_term_global {mk_elaboration=true} (t , pos)
+                                             (Proof_Context.theory_of ctxt))
+      else (Const(s,ty) $ (parsed_elaborate ctxt pos t))
+  | parsed_elaborate ctxt pos (t1 $ t2) = parsed_elaborate ctxt pos (t1) $ parsed_elaborate ctxt pos (t2)
+  | parsed_elaborate ctxt pos (Const(s,ty)) =
+      if is_ISA s
+      then Syntax.check_term ctxt (Const(s, ty))
+           |> (fn t => transduce_term_global {mk_elaboration=true} (t , pos)
+                                             (Proof_Context.theory_of ctxt))
+      else Const(s,ty)
+  | parsed_elaborate ctxt pos (Abs(s,ty,t)) = Abs (s,ty,parsed_elaborate ctxt pos t)
+  | parsed_elaborate _ _ t = t
+
+fun elaborate_term' ctxt parsed_term = parsed_elaborate ctxt Position.none parsed_term
+
 fun elaborate_term ctxt term = transduce_term_global {mk_elaboration=true}
                                                     (term , Position.none)
                                                     (Proof_Context.theory_of ctxt)
@@ -878,6 +930,39 @@ fun check_term ctxt term = transduce_term_global {mk_elaboration=false}
                                                 (term , Position.none)
                                                 (Proof_Context.theory_of ctxt)
 
+(* Check an Isabelle_DOF term-antiquotation from a term that is only parsed (not checked) *)
+fun parsed_check ctxt pos (Const(s,ty) $ t) =
+      if is_ISA s
+      then let val _ = Syntax.check_term ctxt (Const(s, ty) $ t)
+                       |> (fn t => transduce_term_global {mk_elaboration=false} (t , pos)
+                                                         (Proof_Context.theory_of ctxt))
+           in (Const(s,ty) $ (parsed_check ctxt pos t)) end
+      else (Const(s,ty) $ (parsed_check ctxt pos t))
+  | parsed_check ctxt pos (t1 $ t2) = parsed_check ctxt pos (t1) $ parsed_check ctxt pos (t2)
+  | parsed_check ctxt pos (Const(s,ty)) =
+      if is_ISA s
+      then let val _ = Syntax.check_term ctxt (Const(s, ty))
+                       |> (fn t => transduce_term_global {mk_elaboration=false} (t , pos)
+                                                         (Proof_Context.theory_of ctxt))
+           in Const(s,ty) end
+      else Const(s,ty)
+  | parsed_check ctxt pos (Abs(s,ty,t)) = Abs (s,ty,parsed_check ctxt pos t)
+  | parsed_check _ _ t = t
+
+fun check_term' ctxt parsed_term = parsed_check ctxt Position.none parsed_term
+
+fun prep_decls prep_var raw_vars ctxt =
+  let
+    val (vars, ctxt') = fold_map prep_var raw_vars ctxt;
+    val (xs, ctxt'') = ctxt'
+      |> Context_Position.set_visible false
+      |> Proof_Context.add_fixes vars
+      ||> Context_Position.restore_visible ctxt';
+    val _ =
+      Context_Position.reports ctxt''
+        (map (Binding.pos_of o #1) vars ~~
+         map (Variable.markup_entity_def ctxt'' ##> Properties.remove Markup.kindN) xs);
+  in ((vars, xs), ctxt'') end;
 
 fun print_doc_class_tree ctxt P T =
  let val classes = Name_Space.dest_table (get_onto_classes ctxt)
@@ -1104,12 +1189,13 @@ fun check_instance thy (term, _, pos) s =
                    |> Long_Name.qualify qual
    fun check thy (name, _) =
      let
-        val object_cid = DOF_core.cid_of name thy
+        val name' = DOF_core.cid_of name thy
+                    |> DOF_core.get_onto_class_cid thy |> (fst o fst)
        fun check' (class_name, object_cid) =
          if class_name = object_cid
         then DOF_core.value_of name thy
         else err (name ^ " is not an instance of " ^ class_name) pos
-      in check' (class_name, object_cid) end;
+      in check' (class_name, name') end;
  in ML_isa_check_generic check thy (term, pos) end
 
 fun ML_isa_id _ (term,_) = SOME term
@@ -1152,12 +1238,11 @@ fun rm_mixfix name mixfix thy =
           |> Named_Target.theory_map)
  end
 
-fun elaborate_instance thy _ _ term_option pos =
+fun elaborate_instance thy _ _ term_option _ =
  case term_option of
    NONE => error ("Malformed term annotation")
  | SOME term => let val instance_name = HOLogic.dest_string term
-                     val value = DOF_core.value_of instance_name thy
-                 in DOF_core.transduce_term_global {mk_elaboration=true} (value, pos) thy
+                 in DOF_core.value_of instance_name thy
                 end
 
 (*
@@ -1165,14 +1250,15 @@ fun elaborate_instance thy _ _ term_option pos =
  because the class name is already bound to "doc_class Regular_Exp.rexp" constant
  by add_doc_class_cmd function *)
-fun declare_ISA_class_accessor_and_check_instance (doc_class_name, bind_pos) thy =
+fun declare_ISA_class_accessor_and_check_instance (params, doc_class_name, bind_pos) thy =
  let
    val bname = Long_Name.base_name doc_class_name
    val bname' = prefix DOF_core.doc_class_prefix bname
    val bind = bname' |> pair bind_pos |> swap |> Binding.make
    val bind' = prefix DOF_core.long_doc_class_prefix bname |> pair bind_pos |> swap |> Binding.make
-    val const_typ = \<^typ>\string\ --> Syntax.read_typ (Proof_Context.init_global thy) doc_class_name
+    val typ = Type (doc_class_name, map TFree params)
+    val const_typ = \<^typ>\string\ --> typ
    fun mixfix_enclose name = name |> enclose "@{" " _}"
    val mixfix = clean_mixfix bname |> mixfix_enclose
    val mixfix' = clean_mixfix doc_class_name |> mixfix_enclose
@@ -1184,6 +1270,7 @@ fun declare_ISA_class_accessor_and_check_instance (doc_class_name, bind_pos) thy
        |> Sign.add_consts [(bind', const_typ, Mixfix.mixfix mixfix')]
        |> DOF_core.add_isa_transformer bind' ((check_instance, elaborate_instance)
                                              |> DOF_core.make_isa_transformer)
+
  end
 
 fun elaborate_instances_list thy isa_name _ _ _ =
@@ -1204,23 +1291,23 @@ fun elaborate_instances_list thy isa_name _ _ _ =
  in HOLogic.mk_list class_typ values end
 
-fun declare_class_instances_annotation (doc_class_name, bind_pos) thy =
+fun declare_class_instances_annotation (params, doc_class_name, bind_pos) thy =
  let
    val bname = Long_Name.base_name doc_class_name
    val bname' = prefix DOF_core.doc_class_prefix bname |> suffix instances_of_suffixN
    val bind = bname' |> pair bind_pos |> swap |> Binding.make
    val bind' = prefix DOF_core.long_doc_class_prefix bname |> suffix instances_of_suffixN
                |> pair bind_pos |> swap |> Binding.make
-    val class_typ = doc_class_name |> Proof_Context.read_typ (Proof_Context.init_global thy)
+    val typ = Type (doc_class_name, map TFree params)
    fun mixfix_enclose name = name |> enclose "@{" "}"
    val mixfix = clean_mixfix (bname ^ instances_of_suffixN) |> mixfix_enclose
    val mixfix' = clean_mixfix (doc_class_name ^ instances_of_suffixN) |> mixfix_enclose
  in
    thy |> rm_mixfix bname' mixfix
-        |> Sign.add_consts [(bind, \<^Type>\list class_typ\, Mixfix.mixfix mixfix)]
+        |> Sign.add_consts [(bind, \<^Type>\list typ\, Mixfix.mixfix mixfix)]
        |> DOF_core.add_isa_transformer bind ((check_identity, elaborate_instances_list)
                                             |> DOF_core.make_isa_transformer)
-        |> Sign.add_consts [(bind', \<^Type>\list class_typ\, Mixfix.mixfix mixfix')]
+        |> Sign.add_consts [(bind', \<^Type>\list typ\, Mixfix.mixfix mixfix')]
        |> DOF_core.add_isa_transformer bind' ((check_identity, elaborate_instances_list)
                                              |> DOF_core.make_isa_transformer)
  end
@@ -1323,12 +1410,10 @@ ML\
 structure ODL_Meta_Args_Parser =
 struct
 
+type meta_args_t = (binding * (string * Position.T) option)
+                   * ((string * Position.T) * string) list
 
-type meta_args_t = (((string * Position.T) *
-                     (string * Position.T) option)
-                    * ((string * Position.T) * string) list)
-
-val empty_meta_args = ((("", Position.none), NONE), [])
+val empty_meta_args = ((Binding.empty, NONE), [])
 
 val is_improper = not o (Token.is_proper orf Token.is_begin_ignore orf Token.is_end_ignore);
 val improper = Scan.many is_improper; (* parses white-space and comments *)
@@ -1348,7 +1433,7 @@ val attribute_upd : (((string * Position.T) * string) * string) parser =
    : (((string * Position.T) * string) * string) parser;
 
 val reference =
-  Parse.position Parse.name
+  Parse.binding
  --| improper
  -- Scan.option (Parse.$$$ "::"
                  -- improper
@@ -1373,6 +1458,7 @@ val attributes_upd =
          (Scan.optional(Parse.$$$ "," -- improper |-- (Parse.enum "," (improper |-- attribute_upd)))) [])
      --| Parse.$$$ "]")
      --| improper
+
 end (* structure ODL_Meta_Args_Parser *)
 \
@@ -1435,14 +1521,20 @@ fun value_select name ctxt =
  then default_value ctxt
  else Name_Space.get (Evaluators.get (Proof_Context.theory_of ctxt)) name ctxt;
 
-fun value_select' raw_name ctxt =
-  if raw_name = ""
-  then (DOF_core.elaborate_term ctxt) #> default_value ctxt
-  else (DOF_core.elaborate_term ctxt)
-       #> (let val name = intern_evaluator (Proof_Context.theory_of ctxt) raw_name in
-             Name_Space.get (Evaluators.get (Proof_Context.theory_of ctxt)) name ctxt end);
+fun value_select' raw_name ctxt raw_t =
+  let val thy = Proof_Context.theory_of ctxt
+      val t = Syntax.parse_term ctxt raw_t;
+      val t' = DOF_core.elaborate_term' ctxt t
+      val t'' = Syntax.check_term ctxt t'
+  in
+    if raw_name = ""
+    then t'' |> default_value ctxt
+    else let val name = intern_evaluator thy raw_name
+         in t'' |> Name_Space.get (Evaluators.get thy) name ctxt
+         end
+  end
 
-val value = value_select' ""
+val value = value_select ""
 
 val value_without_elaboration = value_select ""
@@ -1459,13 +1551,12 @@ fun cid_2_cidType cid_long thy =
    in fold (fn x => fn y => Type(ty_name x,[y])) (fathers cid_long) \<^Type>\unit\
    end
 
-fun create_default_object thy class_name =
+fun create_default_object thy binding class_name typ =
  let
    val purified_class_name = String.translate (fn #"." => "_" | x => String.implode [x]) class_name
-    val make_const = Syntax.read_term_global thy (Long_Name.qualify class_name makeN);
-    fun attr_to_free (binding, typ, _) = Free (purified_class_name ^ "_"
-                                               ^ (Binding.name_of binding)
-                                               ^ "_Attribute_Not_Initialized", typ)
+    fun attr_to_s (binding, _, _) = purified_class_name ^ "_"
+                                    ^ (Binding.name_of binding)
+                                    ^ "_Attribute_Not_Initialized"
    val class_list = DOF_core.get_attributes class_name thy
    fun attrs_filter [] = []
      | attrs_filter (x::xs) =
@@ -1479,44 +1570,48 @@ fun create_default_object thy class_name =
              else is_duplicated y xs end
        in (cid, filter_out (fn y => is_duplicated y xs) ys)::attrs_filter xs end
    val class_list' = rev (attrs_filter (rev class_list))
-    val tag_attr = HOLogic.mk_number \<^Type>\int\
-    fun add_tag_to_attrs_free' tag_attr thy (cid, filtered_attr_list) =
+    val tag_attr_s = serial () |> string_of_int
+    fun trans_attr thy trans tag_attr (cid, filtered_attr_list) =
      if DOF_core.virtual_of cid thy |> #virtual
-      then (tag_attr (serial ()))::(map (attr_to_free) filtered_attr_list)
-      else (map (attr_to_free) filtered_attr_list)
-    val class_list'' = flat (map (add_tag_to_attrs_free' tag_attr thy) class_list')
-  in list_comb (make_const, (tag_attr (serial()))::class_list'') end
+      then (tag_attr)::(map (trans) filtered_attr_list)
+      else (map (trans) filtered_attr_list)
+    val test_class = class_list' |> map (trans_attr thy (attr_to_s) tag_attr_s)
+                                 |> flat
+                                 |> cons tag_attr_s
+    val term = test_class |> cons (Long_Name.qualify class_name makeN) |> space_implode Symbol.space
+    val ctxt = Proof_Context.init_global thy
+    val term' = term |> Syntax.parse_term ctxt |> DOF_core.elaborate_term' ctxt
+    val parsed_prop = Const (\<^const_name>\Pure.eq\, dummyT) $ Free (Binding.name_of binding, dummyT) $ term'
+    val raw_vars = [(binding, SOME typ, NoSyn)]
+    val (_, vars_ctxt) = DOF_core.prep_decls Proof_Context.cert_var raw_vars ctxt
+    val concl = Syntax.check_prop vars_ctxt parsed_prop
+  in Logic.dest_equals concl |> snd end
 
-fun check_classref {is_monitor=is_monitor} (SOME(cid,pos)) thy =
-  let
-    val cid_long = DOF_core.get_onto_class_name_global' cid thy
-    val DOF_core.Onto_Class {rex, ...} = DOF_core.get_onto_class_global cid_long thy
-    val _ = if is_monitor andalso (null rex orelse cid_long= DOF_core.default_cid )
-            then error("should be monitor class!")
-            else ()
-    val ctxt = Proof_Context.init_global thy
-    val onto_classes = DOF_core.get_onto_classes ctxt
-    val markup = DOF_core.get_onto_class_name_global cid_long thy
-                 |> Name_Space.markup (Name_Space.space_of_table onto_classes)
-    val _ = Context_Position.report ctxt pos markup;
-  in (cid_long, pos)
-  end
-  | check_classref _ NONE _ = (DOF_core.default_cid, Position.none)
+fun check_classref {is_monitor=is_monitor} (SOME (cid, pos)) thy =
+    let
+      val ctxt = Proof_Context.init_global thy
+      val name_cid_typ = DOF_core.get_onto_class_cid thy cid
+      val cid_long = name_cid_typ |> (fst o fst)
+      val rex = DOF_core.rex_of cid_long thy
+      val _ = if is_monitor andalso (null rex orelse cid_long= DOF_core.default_cid ) then error("should be monitor class!") else ()
+      val onto_classes = DOF_core.get_onto_classes ctxt
+      val markup = DOF_core.get_onto_class_name_global cid_long thy
+                   |> Name_Space.markup (Name_Space.space_of_table onto_classes)
+      val _ = Context_Position.report ctxt pos markup;
+    in (name_cid_typ, pos)
+    end
+  | check_classref _ NONE _ = pair DOF_core.default_cid DOF_core.default_cid
+                              |> rpair \<^typ>\unit\
+                              |> rpair Position.none
 
-
-fun generalize_typ n = Term.map_type_tfree (fn (str,sort)=> Term.TVar((str,n),sort));
-fun infer_type thy term = hd (Type_Infer_Context.infer_types (Proof_Context.init_global thy) [term])
-
-
-fun calc_update_term {mk_elaboration=mk_elaboration} thy cid_long
+fun calc_update_term {mk_elaboration=mk_elaboration} thy (name, typ)
                     (S:(string * Position.T * string * term)list) term =
-    let val cid_ty = cid_2_cidType cid_long thy
+    let val cid_long = name
+        val cid_ty = typ
        val ctxt = Proof_Context.init_global thy
-        val generalize_term = Term.map_types (generalize_typ 0)
-        fun toString t = Syntax.string_of_term ctxt t
-        fun instantiate_term S t =
-          Term_Subst.map_types_same (Term_Subst.instantiateT (TVars.make S)) (t)
        fun read_assn (lhs, pos:Position.T, opr, rhs) term =
            let fun get_class_name parent_cid attribute_name pos =
@@ -1532,8 +1627,7 @@ fun calc_update_term {mk_elaboration=mk_elaboration} thy cid_long
                        NONE => ISA_core.err ("Attribute " ^ attribute_name
                                              ^ " not defined for class: " ^ cid_long) pos
-                      | SOME (_, parent_name) =>
-                          get_class_name parent_name attribute_name pos
+                      | SOME (_, parent_name) => get_class_name parent_name attribute_name pos
                    end
                val _ = if mk_elaboration
                        then
@@ -1544,7 +1638,7 @@ fun calc_update_term {mk_elaboration=mk_elaboration} thy cid_long
                          in Context_Position.report ctxt pos markup end
                        else ()
                val info_opt = DOF_core.get_attribute_info cid_long (Long_Name.base_name lhs) thy
-                val (ln,lnt,lnu,lnut) = case info_opt of
+                val (ln,lnt,lnu,_) = case info_opt of
                        NONE => error ("unknown attribute >"
                                       ^((Long_Name.base_name lhs))
                                       ^"< in class: "^cid_long)
@@ -1552,28 +1646,30 @@ fun calc_update_term {mk_elaboration=mk_elaboration} thy cid_long
                                               long_name ^ Record.updateN,
                                               (typ --> typ) --> cid_ty --> cid_ty)
-                val tyenv = Sign.typ_match thy ((generalize_typ 0)(type_of rhs), lnt) (Vartab.empty)
-                            handle Type.TYPE_MATCH => (error ("type of attribute: " ^ ln
-                                                              ^ " does not fit to term: "
-                                                              ^ toString rhs));
-                val tyenv' = (map (fn (s,(t,u)) => ((s,t),u)) (Vartab.dest tyenv))
                val _ = if Long_Name.base_name lhs = lhs orelse ln = lhs then ()
                        else error("illegal notation for attribute of "^cid_long)
                fun join (ttt as \<^Type>\int\) = \<^Const>\Groups.plus ttt\
-                  |join (ttt as \<^Type>\set _\) = \<^Const>\Lattices.sup ttt\
-                  |join \<^Type>\list A\ = \<^Const>\List.append A\
+                  |join (ttt as \<^Type>\set _\) = \<^Const>\Lattices.sup dummyT\
+                  |join \<^Type>\list A\ = \<^Const>\List.append dummyT\
                  |join _ = error("implicit fusion operation not defined for attribute: "^ lhs)
                 (* could be extended to bool, map, multisets, ... *)
-                val rhs' = instantiate_term tyenv' (generalize_term rhs)
-                val rhs'' = DOF_core.transduce_term_global {mk_elaboration=mk_elaboration}
-                                                           (rhs',pos) thy
            in case opr of
-                  "=" => Const(lnu,lnut) $ Abs ("uu_", lnt, rhs'') $ term
-                | ":=" => Const(lnu,lnut) $ Abs ("uu_", lnt, rhs'') $ term
-                | "+=" => Const(lnu,lnut) $ Abs ("uu_", lnt, join lnt $ (Bound 0) $ rhs'') $ term
+                  "=" => Const(lnu, dummyT) $ Abs ("uu_", dummyT, rhs) $ term
+                | ":=" => Const(lnu, dummyT) $ Abs ("uu_", dummyT, rhs) $ term
+                | "+=" => Const(lnu, dummyT) $ Abs ("uu_u", dummyT, join lnt $ (Bound 0) $ rhs) $ term
                | _ => error "corrupted syntax - oops - this should not occur"
-             end
-    in Sign.certify_term thy (fold read_assn S term) end
+             end
+        val t = fold read_assn S term
+        val t' = if mk_elaboration
+                 then DOF_core.elaborate_term' ctxt t
+                 else DOF_core.check_term' ctxt t
+    in if t = term
+       then Sign.certify_term thy t'
+       else
+         let
+           val concl = Syntax.check_term ctxt t';
+         in Sign.certify_term thy concl end
+    end
 
 fun msg thy txt pos = if Config.get_global thy DOF_core.strict_monitor_checking
                      then ISA_core.err txt pos
@@ -1584,9 +1680,10 @@ fun msg_intro get n oid cid = ("accepts clause " ^ Int.toString n
                               ^ get (" not enabled for", " rejected")
                               ^ " doc_class: " ^ cid)
 
-fun register_oid_cid_in_open_monitors oid _ cid_pos thy =
-  let val cid_long= fst cid_pos
-      val pos' = snd cid_pos
+fun register_oid_cid_in_open_monitors binding (name, pos') thy =
+  let
+    val oid = Binding.name_of binding
+    val cid_long= name
      fun is_enabled (n, monitor_info) =
        if exists (DOF_core.is_subclass_global thy cid_long)
                  (DOF_core.alphabet_of n thy)
@@ -1642,11 +1739,11 @@ fun register_oid_cid_in_open_monitors oid _ cid_pos thy =
      val trace_attr = if defined
                       then trace_attr_t cid_long oid
                       else []
-      fun cid_of oid = let val DOF_core.Instance params = DOF_core.get_instance_global oid thy
-                       in #cid params end
+      fun mon_cid oid = DOF_core.cid_of oid thy |> DOF_core.get_onto_class_cid thy
+                        |> (fn ((name, _), typ) => (name, typ))
      val ctxt = Proof_Context.init_global thy
      fun def_trans_value oid =
-        (#1 o (calc_update_term {mk_elaboration=true} thy (cid_of oid) trace_attr))
+        (#1 o (calc_update_term {mk_elaboration=true} thy (mon_cid oid) trace_attr))
        #> value ctxt
      val _ = if null enabled_monitors then ()
@@ -1671,7 +1768,7 @@ fun register_oid_cid_in_open_monitors oid _ cid_pos thy =
      fun update_trace mon_oid =
        if Config.get_global thy DOF_core.object_value_debug
        then let fun def_trans_input_term oid =
-                   #1 o (calc_update_term {mk_elaboration=false} thy (cid_of oid) trace_attr)
+                   #1 o (calc_update_term {mk_elaboration=false} thy (mon_cid oid) trace_attr)
             in DOF_core.map_input_term_value mon_oid
                  (def_trans_input_term mon_oid) (def_trans_value mon_oid) end
        else DOF_core.map_value mon_oid (def_trans_value mon_oid)
@@ -1683,16 +1780,18 @@ fun register_oid_cid_in_open_monitors oid _ cid_pos thy =
    DOF_core.Monitor_Info.map (fold update_info delta_autoS)
  end
 
-fun check_invariants thy (oid, _) =
+fun check_invariants thy binding =
  let
+    val oid = Binding.name_of binding
    val docitem_value = DOF_core.value_of oid thy
-    val cid = DOF_core.cid_of oid thy
+    val name = DOF_core.cid_of oid thy
+               |> DOF_core.get_onto_class_cid thy |> (fst o fst)
    fun get_all_invariants cid thy =
-       case DOF_core.get_onto_class_global cid thy of
-           DOF_core.Onto_Class {inherits_from=NONE, invs, ...} => single (cid, invs)
-         | DOF_core.Onto_Class {inherits_from=SOME(_,father), invs, ...} =>
-             (cid, invs) :: (get_all_invariants father thy)
-    val cids_invariants = get_all_invariants cid thy
+      case DOF_core.get_onto_class_global cid thy of
+          DOF_core.Onto_Class {inherits_from=NONE, invs, ...} => single (cid, invs)
+        | DOF_core.Onto_Class {inherits_from=SOME(_, father), invs, ...} =>
+            (cid, invs) :: get_all_invariants father thy
+    val cids_invariants = get_all_invariants name thy
    val inv_and_apply_list =
      let fun mk_inv_and_apply cid_invs value thy =
            let val (cid_long, invs) = cid_invs
@@ -1725,7 +1824,7 @@ fun check_invariants thy (oid, _) =
                  val thms = Proof_Context.get_thms ctxt (inv_name ^ def_suffixN)
                  (* Get the make definition (def(1) of the record) *)
                  val thms' =
-                    (Proof_Context.get_thms ctxt (Long_Name.append cid defsN)) @ thms
+                    (Proof_Context.get_thms ctxt (Long_Name.append name defsN)) @ thms
                  val _ = Goal.prove ctxt [] [] prop_term
                                     (K ((unfold_tac ctxt thms') THEN (auto_tac ctxt)))
                         |> Thm.close_derivation \<^here>
@@ -1764,15 +1863,18 @@ fun check_invariants thy (oid, _) =
  in thy end
 
-fun create_and_check_docitem is_monitor {is_inline=is_inline} {define=define} oid pos cid_pos doc_attrs thy =
+fun create_and_check_docitem is_monitor {is_inline=is_inline} {define=define} binding cid_pos doc_attrs thy =
  let
-    val cid_pos' = check_classref is_monitor cid_pos thy
+    val oid = Binding.name_of binding
+    val (((name, args_cid), typ), pos') = check_classref is_monitor cid_pos thy
+    val cid_pos' = (name, pos')
    val cid_long = fst cid_pos'
-    val default_cid = cid_long = DOF_core.default_cid
-    val vcid = case cid_pos of NONE => NONE
-                             | SOME (cid,_) => if DOF_core.virtual_of cid thy |> #virtual
-                                               then SOME (DOF_core.get_onto_class_name_global' cid thy)
-                                               else NONE
+    val default_cid = args_cid = DOF_core.default_cid
+    val vcid = if default_cid
+               then NONE
+               else if DOF_core.virtual_of cid_long thy |> #virtual
+                    then SOME args_cid
+                    else NONE
    val value_terms = if default_cid
                      then let
                             val undefined_value = Free ("Undefined_Value", \<^Type>\unit\)
@@ -1782,28 +1884,28 @@ fun create_and_check_docitem is_monitor {is_inline=is_inline} {define=define} oi
                            without using the burden of ontology classes.
                           ex: text*[sdf]\ Lorem ipsum @{thm refl}\ *)
                      else let
-                            val defaults_init = create_default_object thy cid_long
-                            fun conv (na, _(*ty*), term) =(Binding.name_of na, Binding.pos_of na, "=", term);
+                            fun conv_attrs ((lhs, pos), rhs) = (YXML.content_of lhs,pos,"=", Syntax.parse_term (Proof_Context.init_global thy) rhs)
+                            val assns' = map conv_attrs doc_attrs
+                            val defaults_init = create_default_object thy binding cid_long typ
+                            fun conv (na, _(*ty*), parsed_term) =(Binding.name_of na, Binding.pos_of na, "=", parsed_term);
                            val S = map conv (DOF_core.get_attribute_defaults cid_long thy);
                            val (defaults, _(*ty*), _) = calc_update_term {mk_elaboration=false}
-                                                                          thy cid_long S defaults_init;
-                            fun conv_attrs ((lhs, pos), rhs) = (YXML.content_of lhs,pos,"=", Syntax.read_term_global thy rhs)
-                            val assns' = map conv_attrs doc_attrs
+                                                                          thy (name, typ) S defaults_init;
                           val (value_term', _(*ty*), _) = calc_update_term {mk_elaboration=true}
-                                                                             thy cid_long assns' defaults
+                                                                             thy (name, typ) assns' defaults
                          in if Config.get_global thy DOF_core.object_value_debug
                             then let
                                    val (input_term, _(*ty*), _) = calc_update_term {mk_elaboration=false}
-                                                                                    thy cid_long assns' defaults
+                                                                                    thy (name, typ) assns' defaults
                                  in (input_term, value_term') end
                            else (\<^term>\()\, value_term') end
  in thy |> DOF_core.define_object_global
-            {define = define} ((oid, pos), DOF_core.make_instance
+            {define = define} (binding, DOF_core.make_instance
                                            (false, fst value_terms,
                                             (snd value_terms)
                                             |> value (Proof_Context.init_global thy),
-                                            is_inline, cid_long, vcid))
-        |> register_oid_cid_in_open_monitors oid pos cid_pos'
+                                            is_inline, args_cid, vcid))
+        |> register_oid_cid_in_open_monitors binding (name, pos')
        |> (fn thy =>
             if (* declare_reference* without arguments is not checked against invariants *)
               thy |> DOF_core.defined_of oid |> not
@@ -1821,27 +1923,27 @@ fun create_and_check_docitem is_monitor {is_inline=is_inline} {define=define} oi
               ex: text*[sdf]\ Lorem ipsum @{thm refl}\ *)
            |> (fn thy => if default_cid then thy
                          else if Config.get_global thy DOF_core.invariants_checking
-                                then check_invariants thy (oid, pos) else thy))
+                                then check_invariants thy binding else thy))
  end
 
 end (* structure Docitem_Parser *)
 
-fun meta_args_exec (meta_args as (((oid, pos), cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) thy =
+fun meta_args_exec (meta_args as ((binding, cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) thy =
       thy |> (if meta_args = ODL_Meta_Args_Parser.empty_meta_args
              then (K thy)
             else Docitem_Parser.create_and_check_docitem
                    {is_monitor = false} {is_inline = true} {define = true}
-                    oid pos (I cid_pos) (I doc_attrs))
+                    binding (I cid_pos) (I doc_attrs))
 
 fun value_cmd {assert=assert} meta_args_opt raw_name modes raw_t pos thy =
  let
    val thy' = meta_args_exec meta_args_opt thy
    val name = intern_evaluator thy' raw_name;
-    val t = Syntax.read_term_global thy' raw_t;
-    val term' = DOF_core.transduce_term_global {mk_elaboration=true} (t , pos)
-                                               (thy');
-    val t' = value_select name (Proof_Context.init_global thy') term';
+    val t = Syntax.parse_term (Proof_Context.init_global thy') raw_t;
+    val term' = DOF_core.parsed_elaborate (Proof_Context.init_global thy') pos t
+    val term'' = Syntax.check_term (Proof_Context.init_global thy') term'
+    val t' = value_select name (Proof_Context.init_global thy') term'';
    val ty' = Term.type_of t';
    val ty' = if assert
             then case ty' of
@@ -1866,7 +1968,7 @@ val opt_modes =
 val opt_evaluator =
  Scan.optional (\<^keyword>\[\ |-- Parse.name --| \<^keyword>\]\) "";
 
-fun pass_trans_to_value_cmd meta_args_opt ((name, modes), t) trans =
+fun pass_trans_to_value_cmd (((name, meta_args_opt), modes), t) trans =
 let val pos = Toplevel.pos_of trans
 in trans |> Toplevel.theory (value_cmd {assert=false} meta_args_opt name modes t pos)
@@ -1910,7 +2012,7 @@ val (disable_assert_evaluation, disable_assert_evaluation_setup)
 
 val _ = Theory.setup disable_assert_evaluation_setup
 
-fun pass_trans_to_assert_value_cmd meta_args_opt ((name, modes), t) trans =
+fun pass_trans_to_assert_value_cmd (((name, meta_args_opt), modes), t) trans =
 let val pos = Toplevel.pos_of trans
 in trans |> Toplevel.theory
@@ -1923,6 +2025,7 @@ in trans
 end
 
 (* setup ontology aware commands *)
+
 val _ =
  Outer_Syntax.command \<^command_keyword>\term*\ "read and print term"
    (ODL_Meta_Args_Parser.opt_attributes -- (opt_modes -- Parse.term)
@@ -1930,31 +2033,31 @@ val _ =
 
 val _ =
  Outer_Syntax.command \<^command_keyword>\value*\ "evaluate and print term"
-    (ODL_Meta_Args_Parser.opt_attributes -- (opt_evaluator -- opt_modes -- Parse.term)
-     >> (uncurry pass_trans_to_value_cmd));
-
+    ((opt_evaluator -- ODL_Meta_Args_Parser.opt_attributes -- opt_modes -- Parse.term)
+     >> (pass_trans_to_value_cmd));
+
 val _ =
  Outer_Syntax.command \<^command_keyword>\assert*\ "evaluate and assert term"
-    (ODL_Meta_Args_Parser.opt_attributes -- (opt_evaluator -- opt_modes -- Parse.term)
-     >> uncurry pass_trans_to_assert_value_cmd);
+    ((opt_evaluator -- ODL_Meta_Args_Parser.opt_attributes -- opt_modes -- Parse.term)
+     >> pass_trans_to_assert_value_cmd);
 
 (* setup ontology - aware text and ML antiquotations. Due to lexical restrictions, we can not
   declare them as value* or term*, although we will refer to them this way in papers.
*) -local +local fun pretty_term_style ctxt (style: term -> term, t) = - Document_Output.pretty_term ctxt (style (DOF_core.check_term ctxt t)); + Document_Output.pretty_term ctxt (style (DOF_core.check_term ctxt t)) fun print_term ctxt t = ML_Syntax.print_term (DOF_core.check_term (Context.proof_of ctxt) t) in val _ = Theory.setup (Document_Output.antiquotation_pretty_source_embedded \<^binding>\value_\ - (Scan.lift opt_evaluator -- Term_Style.parse -- Args.term) + (Scan.lift opt_evaluator -- Term_Style.parse -- Scan.lift Parse.term) (fn ctxt => fn ((name, style), t) => Document_Output.pretty_term ctxt (style (value_select' name ctxt t))) #> ML_Antiquotation.inline_embedded \<^binding>\value_\ - ((Scan.lift opt_evaluator -- Args.term) + ((Scan.lift (opt_evaluator -- Parse.term)) #> (fn ((name, t),(ctxt, ts)) => (((value_select' name (Context.proof_of ctxt) t) |> (ML_Syntax.atomic o (print_term ctxt))), (ctxt, ts)))) @@ -1977,78 +2080,96 @@ end; (* structure Value_Command *) structure Monitor_Command_Parser = struct -fun update_instance_command (((oid, pos), cid_pos), +fun update_instance_command ((binding, cid_pos), doc_attrs: (((string*Position.T)*string)*string)list) thy : theory = - let val cid = let val DOF_core.Instance {cid,...} = - DOF_core.get_instance_global oid thy - val ctxt = Proof_Context.init_global thy - val instances = DOF_core.get_instances ctxt - val markup = DOF_core.get_instance_name_global oid thy - |> Name_Space.markup (Name_Space.space_of_table instances) - val _ = Context_Position.report ctxt pos markup; - in cid end - val cid_pos' = Value_Command.Docitem_Parser.check_classref {is_monitor = false} - cid_pos thy - val cid_long = fst cid_pos' - val _ = if cid_long = DOF_core.default_cid orelse cid = cid_long - then () - else error("incompatible classes:"^cid^":"^cid_long) - - fun conv_attrs (((lhs, pos), opn), rhs) = ((YXML.content_of lhs),pos,opn, - Syntax.read_term_global thy rhs) - val assns' = map conv_attrs doc_attrs - val def_trans_value 
= - #1 o (Value_Command.Docitem_Parser.calc_update_term {mk_elaboration=true} - thy cid_long assns') - #> Value_Command.value (Proof_Context.init_global thy) - in - thy |> (if Config.get_global thy DOF_core.object_value_debug - then let val def_trans_input_term = - #1 o (Value_Command.Docitem_Parser.calc_update_term - {mk_elaboration=false} thy cid_long assns') - in DOF_core.map_input_term_value oid - def_trans_input_term def_trans_value end - else DOF_core.map_value oid def_trans_value) - |> tap (DOF_core.check_ml_invs cid_long oid {is_monitor=false}) - |> (fn thy => if Config.get_global thy DOF_core.invariants_checking - then Value_Command.Docitem_Parser.check_invariants thy (oid, pos) - else thy) - end + let val oid = Binding.name_of binding + val cid = let val DOF_core.Instance {cid,...} = + DOF_core.get_instance_global oid thy + val ctxt = Proof_Context.init_global thy + val instances = DOF_core.get_instances ctxt + val markup = DOF_core.get_instance_name_global oid thy + |> Name_Space.markup (Name_Space.space_of_table instances) + val _ = Context_Position.report ctxt (Binding.pos_of binding) markup; + in cid end + val default_cid = cid = DOF_core.default_cid + val (((name, cid'), typ), pos') = Value_Command.Docitem_Parser.check_classref {is_monitor = false} + cid_pos thy + val cid_pos' = (name, pos') + val cid_long = fst cid_pos' + val _ = if cid' = DOF_core.default_cid orelse cid = cid' + then () + else error("incompatible classes:"^cid^":"^cid') + fun conv_attrs (((lhs, pos), opn), rhs) = ((YXML.content_of lhs),pos,opn, + Syntax.parse_term (Proof_Context.init_global thy) rhs) + val assns' = map conv_attrs doc_attrs + val def_trans_value = + #1 o (Value_Command.Docitem_Parser.calc_update_term {mk_elaboration=true} + thy (name, typ) assns') + #> Value_Command.value (Proof_Context.init_global thy) + in + thy |> (if Config.get_global thy DOF_core.object_value_debug + then let val def_trans_input_term = + #1 o (Value_Command.Docitem_Parser.calc_update_term + 
{mk_elaboration=false} thy (name, typ) assns') + in DOF_core.map_input_term_value oid + def_trans_input_term def_trans_value end + else DOF_core.map_value oid def_trans_value) + |> tap (DOF_core.check_ml_invs cid_long oid {is_monitor=false}) + (* Bypass checking of high-level invariants when the class default_cid = "text", + the top (default) document class. + We want the class default_cid to stay abstract + and not have the capability to be defined with attribute, invariants, etc. + Hence this bypass handles docitem without a class associated, + for example when you just want a document element to be referentiable + without using the burden of ontology classes. + ex: text*[sdf]\ Lorem ipsum @{thm refl}\ *) + |> (fn thy => if default_cid then thy + else if Config.get_global thy DOF_core.invariants_checking + then Value_Command.Docitem_Parser.check_invariants thy binding + else thy) + end (* General criticism : attributes like "level" were treated here in the kernel instead of dragging them out into the COL -- bu *) -fun open_monitor_command ((((oid, pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) thy = - let fun o_m_c oid pos cid_pos doc_attrs thy = + +fun open_monitor_command (((binding, raw_parent_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) thy = + let fun o_m_c binding params_cid_pos doc_attrs thy = Value_Command.Docitem_Parser.create_and_check_docitem {is_monitor=true} (* this is a monitor *) {is_inline=false} (* monitors are always inline *) {define=true} - oid pos cid_pos doc_attrs thy + binding params_cid_pos doc_attrs thy fun compute_enabled_set cid thy = let val DOF_core.Onto_Class X = DOF_core.get_onto_class_global' cid thy val ralph = RegExpInterface.alphabet (#rejectS X) val aalph = RegExpInterface.alphabet (#rex X) in (aalph, ralph, map (RegExpInterface.rexp_term2da thy aalph)(#rex X)) end - fun create_monitor_entry oid thy = - let val cid = case cid_pos of - NONE => ISA_core.err ("You must specified a monitor class.") pos - | 
SOME (cid, _) => cid - val (accS, rejectS, aS) = compute_enabled_set cid thy - in DOF_core.add_monitor_info (Binding.make (oid, pos)) - (DOF_core.make_monitor_info (accS, rejectS, aS)) thy - end + fun create_monitor_entry thy = + let val cid = case raw_parent_pos of + NONE => ISA_core.err ("You must specify a monitor class.") (Binding.pos_of binding) + | SOME (raw_parent, _) => + DOF_core.markup2string raw_parent + |> DOF_core.get_onto_class_cid thy |> (fst o fst) + val (accS, rejectS, aS) = compute_enabled_set cid thy + in DOF_core.add_monitor_info binding + (DOF_core.make_monitor_info (accS, rejectS, aS)) thy + end in - thy |> o_m_c oid pos cid_pos doc_attrs |> create_monitor_entry oid + thy + |> o_m_c binding raw_parent_pos doc_attrs + |> create_monitor_entry end; -fun close_monitor_command (args as (((oid, pos), _), +fun close_monitor_command (args as ((binding, cid_pos), _: (((string*Position.T)*string)*string)list)) thy = - let fun check_if_final aS = let val i = (find_index (not o RegExpInterface.final) aS) + 1 + let val oid = Binding.name_of binding + val pos = Binding.pos_of binding + fun check_if_final aS = let val i = (find_index (not o RegExpInterface.final) aS) + 1 in if i >= 1 then Value_Command.Docitem_Parser.msg thy @@ -2072,22 +2193,28 @@ fun close_monitor_command (args as (((oid, pos), _), |> update_instance_command args |> tap (DOF_core.check_ml_invs cid_long oid {is_monitor=true}) |> (fn thy => if Config.get_global thy DOF_core.invariants_checking - then Value_Command.Docitem_Parser.check_invariants thy (oid, pos) + then Value_Command.Docitem_Parser.check_invariants thy binding else thy) |> delete_monitor_entry end fun meta_args_2_latex thy sem_attrs transform_attr - ((((lab, pos), cid_opt), attr_list) : ODL_Meta_Args_Parser.meta_args_t) = + (((binding, cid_opt), attr_list) : ODL_Meta_Args_Parser.meta_args_t) = (* for the moment naive, i.e. 
without textual normalization of attribute names and adapted term printing *) - let val l = DOF_core.get_instance_name_global lab thy |> DOF_core.output_name + let val lab = Binding.name_of binding + val l = DOF_core.get_instance_name_global lab thy |> DOF_core.output_name |> enclose "{" "}" |> prefix "label = " - val cid_long = case cid_opt of - NONE => DOF_core.cid_of lab thy - | SOME(cid,_) => DOF_core.get_onto_class_name_global' cid thy + val ((cid_long, _), _) = case cid_opt of + NONE => let val DOF_core.Instance cid = + DOF_core.get_instance_global lab thy + in pair (cid |> #cid) (cid |> #cid) + |> rpair \<^typ>\unit\ end + + | SOME(cid,_) => DOF_core.get_onto_class_cid thy cid + val cid_txt = cid_long |> DOF_core.output_name |> enclose "{" "}" |> prefix "type = " @@ -2123,15 +2250,16 @@ fun meta_args_2_latex thy sem_attrs transform_attr in str_of_term end - fun toLong n = #long_name(the(DOF_core.get_attribute_info cid_long (markup2string n) thy)) - + fun toLong n = #long_name(the(DOF_core.get_attribute_info cid_long (markup2string n) thy)) val ctxt = Proof_Context.init_global thy val actual_args = map (fn ((lhs,_),rhs) => (toLong lhs, ltx_of_markup ctxt rhs)) - attr_list - val default_args = map (fn (b,_,t) => (toLong (Long_Name.base_name ( Sign.full_name thy b)), - ltx_of_term ctxt true t)) - (DOF_core.get_attribute_defaults cid_long thy) - + attr_list + val default_args = + (DOF_core.get_attribute_defaults cid_long thy) + |> map (fn (b,_, parsed_term) => + (toLong (Long_Name.base_name ( Sign.full_name thy b)) + , ltx_of_term ctxt true (Syntax.check_term ctxt parsed_term))) + val default_args_filtered = filter (fn (a,_) => not (exists (fn b => b = a) (map (fn (c,_) => c) actual_args))) default_args val transformed_args = (actual_args@default_args_filtered) @@ -2146,7 +2274,7 @@ fun meta_args_2_latex thy sem_attrs transform_attr then (name, DOF_core.output_name value') else ISA_core.err ("Bad name of semantic attribute" ^ name ^ ": " ^ value - ^ ". 
Name must be ASCII") pos + ^ ". Name must be ASCII") (Binding.pos_of binding) else (name, value') end) in update_sem_std_attrs attrs attrs'' end val updated_sem_std_attrs = update_sem_std_attrs sem_attrs transformed_args @@ -2160,16 +2288,17 @@ fun meta_args_2_latex thy sem_attrs transform_attr Latex.string (enclose "[" "]" (String.concat [ label_and_type, ", args={", (commas str_args), "}"])) end + (* level-attribute information management *) fun gen_enriched_document_cmd {inline} cid_transform attr_transform - ((((oid, pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) : theory -> theory = + (((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) : theory -> theory = Value_Command.Docitem_Parser.create_and_check_docitem {is_monitor = false} {is_inline = inline} - {define = true} oid pos (cid_transform cid_pos) (attr_transform doc_attrs); + {define = true} binding (cid_transform cid_pos) (attr_transform doc_attrs); fun gen_enriched_document_cmd' {inline} cid_transform attr_transform - ((((oid, pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) : theory -> theory = + (((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) : theory -> theory = Value_Command.Docitem_Parser.create_and_check_docitem {is_monitor = false} {is_inline = inline} - {define = false} oid pos (cid_transform cid_pos) (attr_transform doc_attrs); + {define = false} binding (cid_transform cid_pos) (attr_transform doc_attrs); (* markup reports and document output *) @@ -2232,17 +2361,16 @@ val _ = (ODL_Meta_Args_Parser.attributes_upd >> (Toplevel.theory o close_monitor_command)); - val _ = Outer_Syntax.command \<^command_keyword>\update_instance*\ "update meta-attributes of an instance of a document class" (ODL_Meta_Args_Parser.attributes_upd - >> (Toplevel.theory o update_instance_command)); + >> (Toplevel.theory o update_instance_command)); val _ = document_command \<^command_keyword>\text*\ "formal comment (primary style)" {markdown = true, body = 
true} (gen_enriched_document_cmd {inline=true} I I) [] I; - + (* This is just a stub at present *) val _ = @@ -2383,13 +2511,12 @@ ML\ structure ML_star_Command = struct -fun meta_args_exec (meta_args as (((oid, pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) ctxt = +fun meta_args_exec (meta_args as ((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) ctxt = ctxt |> (if meta_args = ODL_Meta_Args_Parser.empty_meta_args then (K ctxt) else Context.map_theory (Value_Command.Docitem_Parser.create_and_check_docitem {is_monitor = false} {is_inline = true} - {define = true} oid pos (I cid_pos) (I doc_attrs)) -) + {define = true} binding (I cid_pos) (I doc_attrs))) val attributes_opt = Scan.option ODL_Meta_Args_Parser.attributes @@ -2424,19 +2551,6 @@ fun get_positions ctxt x = | get _ _ = []; in get [] end; -fun prep_decls prep_var raw_vars ctxt = - let - val (vars, ctxt') = fold_map prep_var raw_vars ctxt; - val (xs, ctxt'') = ctxt' - |> Context_Position.set_visible false - |> Proof_Context.add_fixes vars - ||> Context_Position.restore_visible ctxt'; - val _ = - Context_Position.reports ctxt'' - (map (Binding.pos_of o #1) vars ~~ - map (Variable.markup_entity_def ctxt'' ##> Properties.remove Markup.kindN) xs); - in ((vars, xs), ctxt'') end; - fun dummy_frees ctxt xs tss = let val names = @@ -2447,19 +2561,18 @@ fun dummy_frees ctxt xs tss = fun prep_spec_open prep_var parse_prop raw_vars raw_params raw_prems raw_concl ctxt = let - val ((vars, xs), vars_ctxt) = prep_decls prep_var raw_vars ctxt; + val ((vars, xs), vars_ctxt) = DOF_core.prep_decls prep_var raw_vars ctxt; val (ys, params_ctxt) = vars_ctxt |> fold_map prep_var raw_params |-> Proof_Context.add_fixes; - val props = map (parse_prop params_ctxt) (raw_concl :: raw_prems) |> singleton (dummy_frees params_ctxt (xs @ ys)); - val concl :: prems = Syntax.check_props params_ctxt props; + val props' = props |> map (DOF_core.elaborate_term' ctxt) + val concl :: prems = Syntax.check_props 
params_ctxt props'; val spec = Logic.list_implies (prems, concl); - val spec' = DOF_core.elaborate_term ctxt spec - val spec_ctxt = Variable.declare_term spec' params_ctxt; + val spec_ctxt = Variable.declare_term spec params_ctxt; fun get_pos x = maps (get_positions spec_ctxt x) props; - in ((vars, xs, get_pos, spec'), spec_ctxt) end; + in ((vars, xs, get_pos, spec), spec_ctxt) end; val read_spec_open = prep_spec_open Proof_Context.read_var Syntax.parse_prop; @@ -2648,7 +2761,8 @@ val _ = theorem \<^command_keyword>\proposition*\ false "propositio val _ = theorem \<^command_keyword>\schematic_goal*\ true "schematic goal"; in end \ - + + section\ Syntax for Ontological Antiquotations (the '' View'' Part II) \ ML\ @@ -2677,7 +2791,10 @@ fun check_and_mark ctxt cid_decl ({strict_checking = strict}) {inline=inline_req val markup = name' |> Name_Space.markup (Name_Space.space_of_table instances) (* this sends a report for a ref application to the PIDE interface ... *) val _ = Context_Position.report ctxt pos markup; - val _ = if not(DOF_core.is_subclass ctxt cid cid_decl) + val cid' = if cid = DOF_core.default_cid + then cid + else DOF_core.get_onto_class_cid thy cid |> (fst o fst) + val _ = if not(DOF_core.is_subclass ctxt cid' cid_decl) then error("reference ontologically inconsistent: " ^ name ^ " is an instance of " ^ cid ^ " and " ^ cid @@ -2727,7 +2844,6 @@ fun docitem_antiquotation bind cid = Document_Output.antiquotation_raw bind docitem_antiquotation_parser (pretty_docitem_antiquotation_generic cid) - fun check_and_mark_term ctxt oid = let val ctxt' = Context.proof_of ctxt @@ -2903,28 +3019,22 @@ fun read_parent NONE ctxt = (NONE, ctxt) | T => error ("Bad parent record specification: " ^ Syntax.string_of_typ ctxt T)); - - fun read_fields raw_fields ctxt = let - val Ts = Syntax.read_typs ctxt (map (fn ((_, raw_T, _),_) => raw_T) raw_fields); - val terms = map ((Option.map (Syntax.read_term ctxt)) o snd) raw_fields - fun test t1 t2 = Sign.typ_instance 
(Proof_Context.theory_of ctxt) - (t1, Value_Command.Docitem_Parser.generalize_typ 0 t2) - fun check_default (ty,SOME trm) = - let val ty' = (type_of trm) - in if test ty ty' - then () - else error("type mismatch:"^ - (Syntax.string_of_typ ctxt ty')^":"^ - (Syntax.string_of_typ ctxt ty)) - end - (* BAD STYLE : better would be catching exn. *) - |check_default (_,_) = () - val fields = map2 (fn ((x, _, mx),_) => fn T => (x, T, mx)) raw_fields Ts; - val _ = map check_default (Ts ~~ terms) (* checking types conform to defaults *) - val ctxt' = fold Variable.declare_typ Ts ctxt; - in (fields, terms, ctxt') end; + val fields' = raw_fields |> map (apfst (fn (b, ty, mx) => (b, Syntax.read_typ ctxt ty, mx))) + |> map (apsnd (Option.map (Syntax.parse_term ctxt #> DOF_core.elaborate_term' ctxt))) + fun check_default ctxt ((b, ty, _), SOME trm) = + let val parsed_prop = Const (\<^const_name>\Pure.eq\, dummyT) $ Free (Binding.name_of b, dummyT) $ trm + val raw_vars = [(b, SOME ty, NoSyn)] + val (_, vars_ctxt) = DOF_core.prep_decls Proof_Context.cert_var raw_vars ctxt + (* check typ unification *) + val _ = Syntax.check_prop vars_ctxt parsed_prop + in () end + (* BAD STYLE : better would be catching exn. 
*) + | check_default _ (_, NONE)= () + val _ = map (check_default ctxt) fields' + val ctxt' = fields' |> map (#2 o fst) |> (fn Ts => fold Variable.declare_typ Ts ctxt); + in (map fst fields', map snd fields', ctxt') end; fun def_cmd (decl, spec, prems, params) lthy = let @@ -2942,10 +3052,8 @@ fun define_cond bind f_sty read_cond (ctxt:local_theory) = val args = (SOME(bind,NONE,NoSyn), (Binding.empty_atts,eq),[],[]) in def_cmd args ctxt end -fun define_inv cid_long (bind, inv) thy = +fun define_inv (params, cid_long) (bind, inv) thy = let val inv_term = Syntax.read_term (Proof_Context.init_global thy) inv - (* Rewrite selectors types to allow invariants on attributes of the superclasses - using the polymorphic type of the class *) fun update_attribute_type thy class_scheme_ty cid_long (Const (s, Type (st,[ty, ty'])) $ t) = let @@ -2955,7 +3063,7 @@ fun define_inv cid_long (bind, inv) thy = NONE => Const (s, Type(st,[ty, ty'])) $ (update_attribute_type thy class_scheme_ty cid_long t) | SOME _ => if DOF_core.is_subclass_global thy cid_long cid - then let val Type(_, tys') = ty + then let val Type(st', tys') = ty in if tys' = [\<^typ>\unit\] then Const (s, Type(st,[ty, ty'])) $ (update_attribute_type thy class_scheme_ty cid_long t) @@ -2974,31 +3082,34 @@ fun define_inv cid_long (bind, inv) thy = then Free (s, class_scheme_ty) else Free (s, ty) | update_attribute_type _ _ _ t = t - val inv_ty = Syntax.read_typ (Proof_Context.init_global thy) - (Name.aT ^ " " ^ cid_long ^ schemeN) + val zeta = (singleton (Name.variant_list (map #1 params)) "'z", \<^sort>\type\) + val typ = Type (cid_long ^ schemeN, map TFree (params @ [zeta])) (* Update the type of each attribute update function to match the type of the current class. 
*) - val inv_term' = update_attribute_type thy inv_ty cid_long inv_term - val eq_inv_ty = inv_ty --> HOLogic.boolT - val abs_term = Term.lambda (Free (instance_placeholderN, inv_ty)) inv_term' + val inv_term' = update_attribute_type thy typ cid_long inv_term + val eq_inv_ty = typ --> HOLogic.boolT + val abs_term = Term.lambda (Free (instance_placeholderN, typ)) inv_term' in thy |> Named_Target.theory_map (define_cond bind eq_inv_ty abs_term) end fun add_doc_class_cmd overloaded (raw_params, binding) raw_parent raw_fieldsNdefaults reject_Atoms regexps invariants thy = - let + let val bind_pos = Binding.pos_of binding val name = Binding.name_of binding val ctxt = Proof_Context.init_global thy; val params = map (apsnd (Typedecl.read_constraint ctxt)) raw_params; val ctxt1 = fold (Variable.declare_typ o TFree) params ctxt; - fun cid thy = (* takes class synonyms into account *) - DOF_core.get_onto_class_name_global' name thy + fun markup2string s = String.concat (List.filter (fn c => c <> Symbol.DEL) + (Symbol.explode (YXML.content_of s))) + val name' = + case raw_parent of + NONE => DOF_core.default_cid + | SOME s => markup2string s |> (fn s => DOF_core.get_onto_class_name_global' s thy) + fun cid thy = DOF_core.get_onto_class_name_global name thy val (parent, ctxt2) = read_parent raw_parent ctxt1; - (* takes class synonyms into account *) - val parent' = parent |> Option.map (fn (x, y) => (x, DOF_core.get_onto_class_name_global' y thy)) - val parent_cid_long = if is_some parent' - then parent' |> the |> snd - else DOF_core.default_cid + val parent' = case parent of + NONE => NONE + | SOME (Ts, _) => SOME (Ts, name') val raw_fieldsNdefaults' = filter (fn((bi,_,_),_) => Binding.name_of bi <> traceN) raw_fieldsNdefaults val _ = if length raw_fieldsNdefaults' <> length raw_fieldsNdefaults @@ -3006,22 +3117,21 @@ fun add_doc_class_cmd overloaded (raw_params, binding) else () val raw_fieldsNdefaults'' = if null regexps then raw_fieldsNdefaults' - else 
trace_attr_ts::raw_fieldsNdefaults' - val (fields, terms, ctxt3) = read_fields raw_fieldsNdefaults'' ctxt2; - val fieldsNterms = (map (fn (a,b,_) => (a,b)) fields) ~~ terms + else trace_attr_ts::raw_fieldsNdefaults' + val (fields, parsed_terms, ctxt3) = read_fields raw_fieldsNdefaults'' ctxt2; + val fieldsNterms = (map (fn (a,b,_) => (a,b)) fields) ~~ parsed_terms val fieldsNterms' = map (fn ((x,y),z) => (x,y,z)) fieldsNterms val params' = map (Proof_Context.check_tfree ctxt3) params; fun check_n_filter thy (bind,ty,mf) = - case DOF_core.get_attribute_info parent_cid_long (Binding.name_of bind) thy of - NONE => SOME(bind,ty,mf) - | SOME{def_occurrence,long_name,typ,...} - => if ty = typ - then (warning("overriding attribute:" - ^ long_name - ^ " in doc class:" - ^ def_occurrence); - NONE) - else error("no overloading allowed.") + case DOF_core.get_attribute_info name' (Binding.name_of bind) thy of + NONE => SOME(bind,ty,mf) + | SOME{def_occurrence,long_name,typ,...} => + if ty = typ + then (warning("overriding attribute:" + ^ long_name + ^ " in doc class:" + ^ def_occurrence); NONE) + else error("no overloading allowed.") val record_fields = map_filter (check_n_filter thy) fields fun mk_eq_pair name = (Free (name, doc_class_rexp_T), doc_class_rexp_t name) |> mk_meta_eq @@ -3046,12 +3156,12 @@ fun add_doc_class_cmd overloaded (raw_params, binding) else add [DOF_core.tag_attr] invariants' {virtual=true}) |> (fn thy => OntoLinkParser.docitem_antiquotation binding (cid thy) thy) (* defines the ontology-checked text antiquotation to this document class *) - |> (fn thy => fold(define_inv (cid thy)) (invariants') thy) + |> (fn thy => fold(define_inv (params', (cid thy))) (invariants') thy) (* The function declare_ISA_class_accessor_and_check_instance uses a prefix because the class name is already bound to "doc_class Regular_Exp.rexp" constant by add_doc_class_cmd function *) - |> (fn thy => ISA_core.declare_ISA_class_accessor_and_check_instance (cid thy, bind_pos) thy) - 
|> (fn thy => ISA_core.declare_class_instances_annotation (cid thy, bind_pos) thy) + |> (fn thy => ISA_core.declare_ISA_class_accessor_and_check_instance (params', cid thy, bind_pos) thy) + |> (fn thy => ISA_core.declare_class_instances_annotation (params', cid thy, bind_pos) thy) end; diff --git a/Isabelle_DOF/thys/RegExpInterface.thy b/Isabelle_DOF/thys/RegExpInterface.thy index 660d175..6ad24f6 100644 --- a/Isabelle_DOF/thys/RegExpInterface.thy +++ b/Isabelle_DOF/thys/RegExpInterface.thy @@ -51,7 +51,13 @@ definition rep1 :: "'a rexp \ 'a rexp" ("\(_)\\<^sup definition opt :: "'a rexp \ 'a rexp" ("\(_)\") where "\A\ \ A || One" - + +find_consts name:"RegExpI*" + +ML\ +val t = Sign.syn_of \<^theory> +\ +print_syntax value "Star (Conc(Alt (Atom(CHR ''a'')) (Atom(CHR ''b''))) (Atom(CHR ''c'')))" text\or better equivalently:\ value "\(\CHR ''a''\ || \CHR ''b''\) ~~ \CHR ''c''\\\<^sup>*" @@ -240,6 +246,8 @@ no_notation Star ("\(_)\\<^sup>*" [0]100) no_notation Plus (infixr "||" 55) no_notation Times (infixr "~~" 60) no_notation Atom ("\_\" 65) +no_notation rep1 ("\(_)\\<^sup>+") +no_notation opt ("\(_)\") ML\ structure RegExpInterface_Notations = @@ -248,7 +256,9 @@ val Star = (\<^term>\Regular_Exp.Star\, Mixfix (Syntax.read_input " val Plus = (\<^term>\Regular_Exp.Plus\, Infixr (Syntax.read_input "||", 55, Position.no_range)) val Times = (\<^term>\Regular_Exp.Times\, Infixr (Syntax.read_input "~~", 60, Position.no_range)) val Atom = (\<^term>\Regular_Exp.Atom\, Mixfix (Syntax.read_input "\_\", [], 65, Position.no_range)) -val notations = [Star, Plus, Times, Atom] +val opt = (\<^term>\RegExpInterface.opt\, Mixfix (Syntax.read_input "\(_)\", [], 1000, Position.no_range)) +val rep1 = (\<^term>\RegExpInterface.rep1\, Mixfix (Syntax.read_input "\(_)\\<^sup>+", [], 1000, Position.no_range)) +val notations = [Star, Plus, Times, Atom, rep1, opt] end \ diff --git a/Isabelle_DOF/thys/manual/M_02_Background.thy b/Isabelle_DOF/thys/manual/M_02_Background.thy index 
69cd237..68cdbf5 100644 --- a/Isabelle_DOF/thys/manual/M_02_Background.thy +++ b/Isabelle_DOF/thys/manual/M_02_Background.thy @@ -208,13 +208,13 @@ text\ in many features over-accomplishes the required features of \<^dof>. \ -figure*["fig:dof-ide",relative_width="95",file_src="''figures/cicm2018-combined.png''"]\ +figure*["fig_dof_ide",relative_width="95",file_src="''figures/cicm2018-combined.png''"]\ The \<^isadof> IDE (left) and the corresponding PDF (right), showing the first page of~@{cite "brucker.ea:isabelle-ontologies:2018"}.\ text\ We call the present implementation of \<^dof> on the Isabelle platform \<^isadof> . - @{figure "fig:dof-ide"} shows a screen-shot of an introductory paper on + @{figure "fig_dof_ide"} shows a screen-shot of an introductory paper on \<^isadof>~@{cite "brucker.ea:isabelle-ontologies:2018"}: the \<^isadof> PIDE can be seen on the left, while the generated presentation in PDF is shown on the right. diff --git a/Isabelle_DOF/thys/manual/M_03_GuidedTour.thy b/Isabelle_DOF/thys/manual/M_03_GuidedTour.thy index 1b0e167..859bd38 100644 --- a/Isabelle_DOF/thys/manual/M_03_GuidedTour.thy +++ b/Isabelle_DOF/thys/manual/M_03_GuidedTour.thy @@ -477,7 +477,7 @@ on the level of generated \<^verbatim>\.aux\-files, which are not n error-message and compiling it with a consistent bibtex usually makes disappear this behavior. 
\ -subsection*["using-term-aq"::technical, main_author = "Some @{author ''bu''}"] +subsection*["using_term_aq"::technical, main_author = "Some @{author ''bu''}"] \Using Term-Antiquotations\ text\The present version of \<^isadof> is the first version that supports the novel feature of @@ -577,11 +577,11 @@ term antiquotations: \ (*<*) -declare_reference*["subsec:onto-term-ctxt"::technical] +declare_reference*["subsec_onto_term_ctxt"::technical] (*>*) text\They are text-contexts equivalents to the \<^theory_text>\term*\ and \<^theory_text>\value*\ commands - for term-contexts introduced in @{technical (unchecked) \subsec:onto-term-ctxt\}\ + for term-contexts introduced in @{technical (unchecked) \subsec_onto_term_ctxt\}\ subsection\A Technical Report with Tight Checking\ text\An example of tight checking is a small programming manual to document programming trick diff --git a/Isabelle_DOF/thys/manual/M_04_RefMan.thy b/Isabelle_DOF/thys/manual/M_04_RefMan.thy index 1f78055..27d3313 100644 --- a/Isabelle_DOF/thys/manual/M_04_RefMan.thy +++ b/Isabelle_DOF/thys/manual/M_04_RefMan.thy @@ -164,7 +164,7 @@ text\ to and between ontological concepts. \ -subsection*["odl-manual0"::technical]\Some Isabelle/HOL Specification Constructs Revisited\ +subsection*["odl_manual0"::technical]\Some Isabelle/HOL Specification Constructs Revisited\ text\ As ODL is an extension of Isabelle/HOL, document class definitions can therefore be arbitrarily mixed with standard HOL specification constructs. To make this manual self-contained, we present @@ -231,7 +231,7 @@ corresponding type-name \<^boxed_theory_text>\0.foo\ is not. 
For th definition of a \<^boxed_theory_text>\doc_class\ reject problematic lexical overlaps.\ -subsection*["odl-manual1"::technical]\Defining Document Classes\ +subsection*["odl_manual1"::technical]\Defining Document Classes\ text\ A document class\<^bindex>\document class\ can be defined using the @{command "doc_class"} keyword: \<^item> \class_id\:\<^bindex>\class\_id@\class_id\\ a type-\name\ that has been introduced @@ -350,7 +350,7 @@ layout; these commands have to be wrapped into text\ \<^item> \obj_id\:\<^index>\obj\_id@\obj_id\\ (or \<^emph>\oid\\<^index>\oid!oid@\see obj_id\\ for short) a \<^emph>\name\ - as specified in @{technical \odl-manual0\}. + as specified in @{technical \odl_manual0\}. \<^item> \meta_args\ : \<^rail>\obj_id ('::' class_id) ((',' attribute '=' HOL_term) *) \ \<^item> \<^emph>\evaluator\: from @{cite "wenzel:isabelle-isar:2020"}, evaluation is tried first using ML, @@ -465,16 +465,16 @@ text*[b::B'_test']\\ term*\@{B'_test' \b\}\ -declare_reference*["text-elements-expls"::technical] +declare_reference*["text_elements_expls"::technical] (*>*) -subsection*["subsec:onto-term-ctxt"::technical]\Ontological Term-Contexts and their Management\ +subsection*["subsec_onto_term_ctxt"::technical]\Ontological Term-Contexts and their Management\ text\ \<^item> \annotated_term_element\ \<^rail>\ (@@{command "term*"} ('[' meta_args ']')? '\' HOL_term '\' | (@@{command "value*"} - | @@{command "assert*"}) \ ('[' meta_args ']')? ('[' evaluator ']')? '\' HOL_term '\' + | @@{command "assert*"}) \ ('[' evaluator ']')? ('[' meta_args ']')? '\' HOL_term '\' | (@@{command "definition*"}) ('[' meta_args ']')? ('... see ref manual') | (@@{command "lemma*"} | @@{command "theorem*"} | @@{command "corollary*"} @@ -503,9 +503,9 @@ for example). With the exception of the @{command "term*"}-command, the term-ant This expansion happens \<^emph>\before\ evaluation of the term, thus permitting executable HOL-functions to interact with meta-objects. 
The @{command "assert*"}-command allows for logical statements to be checked in the global context
-(see @{technical (unchecked) \text-elements-expls\}).
+(see @{technical (unchecked) \text_elements_expls\}).
% TODO:
-% Section reference @{docitem (unchecked) \text_elements_expls\} has not the right number
This is particularly useful to explore formal definitions wrt. their border cases.
For @{command "assert*"}, the evaluation of the term can be disabled with the
\<^boxed_theory_text>\disable_assert_evaluation\ theory attribute:
@@ -558,7 +558,7 @@ of this meta-object. The latter leads to a failure of the entire command.
\
(*<*)
-declare_reference*["sec:advanced"::technical]
+declare_reference*["sec_advanced"::technical]
(*>*)
subsection\Status and Query Commands\
@@ -586,7 +586,7 @@ text\ The raw term will be available in the \input_term\ field of
\<^theory_text>\print_doc_items\ output and,
\<^item> \<^theory_text>\check_doc_global\ checks if all declared object references have been
defined, all monitors are in a final state, and checks the final invariant
-      on all objects (cf. @{technical (unchecked) \sec:advanced\})
+      on all objects (cf. 
@{technical (unchecked) \sec_advanced\}) \ subsection\Macros\ @@ -738,7 +738,7 @@ text\The command syntax follows the implicit convention to add a ``*'' to distinguish them from the (similar) standard Isabelle text-commands which are not ontology-aware.\ -subsection*["text-elements"::technical]\The Ontology \<^verbatim>\scholarly_paper\\ +subsection*["text_elements"::technical]\The Ontology \<^verbatim>\scholarly_paper\\ (*<*) ML\val toLaTeX = String.translate (fn c => if c = #"_" then "\\_" else String.implode[c])\ ML\writeln (DOF_core.print_doc_class_tree @@ -821,9 +821,9 @@ or \text*[\::example, main_author = "Some(@{author \bu\})"] \ \ \\} where \<^boxed_theory_text>\"''bu''"\ is a string presentation of the reference to the author -text element (see below in @{docitem (unchecked) \text-elements-expls\}). +text element (see below in @{docitem (unchecked) \text_elements_expls\}). % TODO: -% Section reference @{docitem (unchecked) \text-elements-expls\} has not the right number +% Section reference @{docitem (unchecked) \text_elements_expls\} has not the right number \ text\Some of these concepts were supported as command-abbreviations leading to the extension @@ -866,7 +866,7 @@ of Isabelle is its ability to handle both, and to establish links between both w Therefore, the formal assertion command has been integrated to capture some form of formal content.\ -subsubsection*["text-elements-expls"::example]\Examples\ +subsubsection*["text_elements_expls"::example]\Examples\ text\ While the default user interface for class definitions via the @@ -1018,9 +1018,9 @@ schemata: -section*["sec:advanced"::technical]\Advanced ODL Concepts\ +section*["sec_advanced"::technical]\Advanced ODL Concepts\ -subsection*["sec:example"::technical]\Example\ +subsection*["sec_example"::technical]\Example\ text\We assume in this section the following local ontology: @{boxed_theory_text [display]\ @@ -1089,11 +1089,11 @@ text\ \ (*<*) -declare_reference*["sec:monitors"::technical] 
-declare_reference*["sec:low_level_inv"::technical]
+declare_reference*["sec_monitors"::technical]
+declare_reference*["sec_low_level_inv"::technical]
(*>*)
-subsection*["sec:class_inv"::technical]\ODL Class Invariants\
+subsection*["sec_class_inv"::technical]\ODL Class Invariants\
text\
Ontological classes as described so far are too liberal in many situations.
@@ -1144,7 +1144,7 @@ text\
Hence, the \<^boxed_theory_text>\inv1\ invariant is checked when the instance
\<^boxed_theory_text>\testinv2\ is defined.
-  Now let's add some invariants to our example in \<^technical>\sec:example\.
+  Now let's add some invariants to our example in \<^technical>\sec_example\.
For example, one would like to express that any instance of a \<^boxed_theory_text>\result\
class finally has a non-empty property list, if its \<^boxed_theory_text>\kind\ is
\<^boxed_theory_text>\proof\, or that
@@ -1178,22 +1178,22 @@ text\
declare[[invariants_checking_with_tactics = true]]\}
There are still some limitations with this high-level syntax.
For now, the high-level syntax does not support the checking of
-  specific monitor behaviors (see @{technical (unchecked) "sec:monitors"}).
+  specific monitor behaviors (see @{technical (unchecked) "sec_monitors"}).
For example, one would like to delay a final error message until the closing
of a monitor.
For this use-case you can use low-level class invariants
-  (see @{technical (unchecked) "sec:low_level_inv"}).
+  (see @{technical (unchecked) "sec_low_level_inv"}).
Also, for now, term-antiquotations cannot be used in an invariant formula.
\
-subsection*["sec:low_level_inv"::technical]\ODL Low-level Class Invariants\
+subsection*["sec_low_level_inv"::technical]\ODL Low-level Class Invariants\
text\
If one wants to go beyond the limitations of the current high-level syntax
for invariants, one can define a function in SML.
A formulation, in SML, of the class-invariant \<^boxed_theory_text>\has_property\
-  in \<^technical>\sec:class_inv\, defined in the supposedly \Low_Level_Syntax_Invariants\ theory
+  in \<^technical>\sec_class_inv\, defined in the hypothetical \Low_Level_Syntax_Invariants\ theory
(note the long name of the class), is straightforward:
@@ -1222,7 +1222,7 @@ val _ = Theory.setup (DOF_core.make_ml_invariant (check_result_inv, cid_long)
\<^boxed_theory_text>\oid\ is bound to a variable here and can therefore not be
statically expanded.
\
-subsection*["sec:monitors"::technical]\ODL Monitors\
+subsection*["sec_monitors"::technical]\ODL Monitors\
text\
We call a document class with an \accepts_clause\ a \<^emph>\monitor\.\<^bindex>\monitor\
Syntactically, an \accepts_clause\\<^index>\accepts\_clause@\accepts_clause\\
contains a regular expression over class identifiers.
@@ -1291,18 +1291,18 @@ text\
sections. For now, the high-level syntax of invariants does not support
the checking of specific monitor behaviors like the one just described, and you must use
-  the low-level class invariants (see @{technical "sec:low_level_inv"}).
+  the low-level class invariants (see @{technical "sec_low_level_inv"}).
Low-level invariant checking can be set up to be triggered when opening
a monitor, when closing a monitor, or both, by using the
\<^ML>\DOF_core.add_opening_ml_invariant\,
\<^ML>\DOF_core.add_closing_ml_invariant\, or \<^ML>\DOF_core.add_ml_invariant\
functions, respectively, to add the invariants to the theory context
-  (See @{technical "sec:low_level_inv"} for an example).
+  (see @{technical "sec_low_level_inv"} for an example).
\
-subsection*["sec:queries_on_instances"::technical]\Queries On Instances\
+subsection*["sec_queries_on_instances"::technical]\Queries On Instances\
text\
Any class definition generates term antiquotations checking a class instance or
@@ -1315,19 +1315,19 @@ text\
or to get the list of the authors of the instances of \introduction\, it suffices
to treat this meta-data as usual:
@{theory_text [display,indent=5, margin=70] \
-value*\map (result.property) @{result-instances}\
-value*\map (text_section.authored_by) @{introduction-instances}\
+value*\map (result.property) @{result_instances}\
+value*\map (text_section.authored_by) @{introduction_instances}\
\}
In order to get the list of the instances of the class \myresult\
whose \evidence\ is a \proof\, one can use the command:
@{theory_text [display,indent=5, margin=70] \
-value*\filter (\\. result.evidence \ = proof) @{result-instances}\
+value*\filter (\\. result.evidence \ = proof) @{result_instances}\
\}
The list of the instances of the class \introduction\ whose \level\ > 1
can be filtered by:
@{theory_text [display,indent=5, margin=70] \
value*\filter (\\. the (text_section.level \) > 1)
-      @{introduction-instances}\
+      @{introduction_instances}\
\}
\
@@ -1414,7 +1414,7 @@ text\
\
-section*["document-templates"::technical]\Defining Document Templates\
+section*["document_templates"::technical]\Defining Document Templates\
subsection\The Core Template\
text\
diff --git a/Isabelle_DOF/thys/manual/M_05_Implementation.thy b/Isabelle_DOF/thys/manual/M_05_Implementation.thy
index fc1207a..b5dadd0 100644
--- a/Isabelle_DOF/thys/manual/M_05_Implementation.thy
+++ b/Isabelle_DOF/thys/manual/M_05_Implementation.thy
@@ -188,7 +188,7 @@ text\
section\Programming Class Invariants\
text\
-  See \<^technical>\sec:low_level_inv\.
+  See \<^technical>\sec_low_level_inv\.
\ section\Implementing Monitors\ @@ -203,7 +203,7 @@ text\ val next : automaton -> env -> cid -> automaton\} where \<^boxed_sml>\env\ is basically a map between internal automaton states and class-id's (\<^boxed_sml>\cid\'s). An automaton is said to be \<^emph>\enabled\ for a class-id, - iff it either occurs in its accept-set or its reject-set (see @{docitem "sec:monitors"}). During + iff it either occurs in its accept-set or its reject-set (see @{docitem "sec_monitors"}). During top-down document validation, whenever a text-element is encountered, it is checked if a monitor is \emph{enabled} for this class; in this case, the \<^boxed_sml>\next\-operation is executed. The transformed automaton recognizing the suffix is stored in \<^boxed_sml>\docobj_tab\ if @@ -228,7 +228,7 @@ text\ \expandafter\providekeycommand\csname isaDof.#1\endcsname}%\} The \<^LaTeX>-generator of \<^isadof> maps each \<^boxed_theory_text>\doc_item\ to an \<^LaTeX>-environment (recall - @{docitem "text-elements"}). As generic \<^boxed_theory_text>\doc_item\s are derived from the text element, + @{docitem "text_elements"}). As generic \<^boxed_theory_text>\doc_item\s are derived from the text element, the environment \inlineltx|isamarkuptext*| builds the core of \<^isadof>'s \<^LaTeX> implementation. \
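Reviewer note: the M_05_Implementation hunk above describes the monitor discipline only in prose — a monitor is *enabled* for a class-id iff the id occurs in its accept-set or its reject-set, and top-down validation steps every enabled monitor via `next`. A minimal sketch of that discipline, in Python purely for illustration (the actual implementation is the SML interface shown in the patch; all names below are hypothetical):

```python
class Monitor:
    """Illustrative stand-in for the SML `automaton`/`env` pair in the patch."""

    def __init__(self, state, transitions, accept_set, reject_set):
        self.state = state                  # current automaton state
        self.transitions = transitions      # (state, cid) -> state
        self.accept_set = accept_set        # class-ids in the accept-set
        self.reject_set = reject_set        # class-ids in the reject-set

    def enabled(self, cid):
        # A monitor is *enabled* for a class-id iff the id occurs in its
        # accept-set or its reject-set (as stated in the hunk above).
        return cid in self.accept_set or cid in self.reject_set

    def next(self, cid):
        # Advance the automaton; a missing transition signals a violation.
        key = (self.state, cid)
        if key not in self.transitions:
            raise ValueError(f"monitor violation at class-id {cid!r}")
        return Monitor(self.transitions[key], self.transitions,
                       self.accept_set, self.reject_set)


def validate(monitors, cids):
    # Top-down document validation: for each text element encountered,
    # execute `next` on every monitor enabled for its class-id.
    for cid in cids:
        monitors = [m.next(cid) if m.enabled(cid) else m for m in monitors]
    return monitors
```

Elements whose class-id is in neither set simply pass the monitor through unchanged, which mirrors why only *enabled* monitors are stepped during validation.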