forked from Isabelle_DOF/Isabelle_DOF

Compare commits: main...monitor-na (94 commits)
Author | SHA1 | Date |
---|---|---|
Nicolas Méric | 9089c55e2f | |
Nicolas Méric | c440f9628f | |
Nicolas Méric | 5b3086bbe5 | |
Nicolas Méric | 7c0d2cee55 | |
Nicolas Méric | 7c6150affa | |
Nicolas Méric | ad4ad52b4e | |
Nicolas Méric | ba8227e6ab | |
Nicolas Méric | 20b0af740d | |
Nicolas Méric | 1379f8a671 | |
Achim D. Brucker | 8fdaafa295 | |
Nicolas Méric | 8513f7d267 | |
Nicolas Méric | 2b1a9d009e | |
Nicolas Méric | cd758d2c44 | |
Nicolas Méric | 8496963fec | |
Nicolas Méric | 72d8000f7b | |
Nicolas Méric | 17ec11b297 | |
Nicolas Méric | a96e17abf3 | |
Nicolas Méric | 74b60e47d5 | |
Nicolas Méric | a42dd4ea6c | |
Nicolas Méric | b162a24749 | |
Nicolas Méric | a9432c7b52 | |
Nicolas Méric | 9f28d4949e | |
Nicolas Méric | 885c23a138 | |
Nicolas Méric | a589d4cd47 | |
Burkhart Wolff | e1f143d151 | |
Burkhart Wolff | fd60cf2312 | |
Nicolas Méric | 73dfcd6c1e | |
Nicolas Méric | c0afe1105e | |
Burkhart Wolff | e414b97afb | |
Nicolas Méric | 0b2d28b547 | |
Nicolas Méric | 37d7ed7d17 | |
Nicolas Méric | 312734afbd | |
Burkhart Wolff | 8cee80d78e | |
Makarius Wenzel | ec0d525426 | |
Makarius Wenzel | 791990039b | |
Makarius Wenzel | 78d61390fe | |
Makarius Wenzel | ffcf1f3240 | |
Makarius Wenzel | 5471d873a9 | |
Makarius Wenzel | df37250a00 | |
Makarius Wenzel | 185daeb577 | |
Makarius Wenzel | 8037fd15f2 | |
Makarius Wenzel | afcd78610b | |
Makarius Wenzel | b8a9ef5118 | |
Makarius Wenzel | a4e75c8b12 | |
Makarius Wenzel | d20e9ccd22 | |
Makarius Wenzel | f2ee5d3780 | |
Makarius Wenzel | 44cae2e631 | |
Makarius Wenzel | 7b2bf35353 | |
Makarius Wenzel | e8c7fa6018 | |
Makarius Wenzel | b12e61511d | |
Makarius Wenzel | 3cac42e6cb | |
Makarius Wenzel | aee8ba1df1 | |
Makarius Wenzel | d93e1383d4 | |
Makarius Wenzel | 3d5d1e7476 | |
Makarius Wenzel | 4264e7cd15 | |
Makarius Wenzel | 96f4077c53 | |
Makarius Wenzel | d7fb39d7eb | |
Makarius Wenzel | b95826962f | |
Makarius Wenzel | 912d4bb49e | |
Makarius Wenzel | a6c1a2baa4 | |
Makarius Wenzel | bb5963c6e2 | |
Makarius Wenzel | cc3e2a51a4 | |
Makarius Wenzel | 9e4e5b49eb | |
Makarius Wenzel | b65ecbdbef | |
Makarius Wenzel | 3be2225dcf | |
Makarius Wenzel | f44f0af01c | |
Makarius Wenzel | 9a11baf840 | |
Makarius Wenzel | 48c167aa23 | |
Makarius Wenzel | 700a9bbfee | |
Makarius Wenzel | 73299941ad | |
Makarius Wenzel | 5a8c438c41 | |
Makarius Wenzel | 7772c73aaa | |
Makarius Wenzel | ca18453043 | |
Makarius Wenzel | 1a122b1a87 | |
Makarius Wenzel | 47d95c467e | |
Makarius Wenzel | bf3085d4c0 | |
Makarius Wenzel | 068e6e0411 | |
Makarius Wenzel | 09e9980691 | |
Makarius Wenzel | 94ce3fdec2 | |
Makarius Wenzel | 44819bff02 | |
Makarius Wenzel | a6ab1e101e | |
Makarius Wenzel | c29ec9641a | |
Nicolas Méric | 06833aa190 | |
Nicolas Méric | 4f0c7e1e95 | |
Nicolas Méric | 0040949cf8 | |
Nicolas Méric | e68c332912 | |
Burkhart Wolff | b2c4f40161 | |
Burkhart Wolff | 309952e0ce | |
Burkhart Wolff | 830e1b440a | |
Burkhart Wolff | 2149db9efc | |
Burkhart Wolff | 1547ace64b | |
Burkhart Wolff | 39acd61dfd | |
Burkhart Wolff | 29770b17ee | |
Achim D. Brucker | b4f4048cff | |
@@ -7,9 +7,10 @@ pipeline:
   - export `isabelle getenv ISABELLE_HOME_USER`
   - mkdir -p $ISABELLE_HOME_USER/etc
   - echo "ISABELLE_PDFLATEX=\"$LATEX --file-line-error\"" >> $ISABELLE_HOME_USER/etc/settings
-  - isabelle components -u `pwd`
-  - isabelle build -D . -o browser_info
-  - isabelle dof_mkroot DOF_test
+  - isabelle components -u .
+  - isabelle build -D . -o browser_info
+  - isabelle dof_mkroot -q DOF_test
   - isabelle build -D DOF_test
   - cp -r $ISABELLE_HOME_USER/browser_info $ARTIFACT_DIR
   - cd $ARTIFACT_DIR
@@ -194,8 +194,8 @@ for i in $VARS; do
   export "$i"
 done
 
-ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL options -g dof_isabelle)"
-DOF_VERSION="$($ISABELLE_TOOL options -g dof_version)"
+ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL dof_param -b isabelle_version)"
+DOF_VERSION="$($ISABELLE_TOOL dof_param -b dof_version)"
 
 ISABELLE_SHORT_VERSION=`echo $ISABELLE_VERSION | sed -e 's/:.*$//'`
 ISADOF_TAR="Isabelle_DOF-"$DOF_VERSION"_"$ISABELLE_SHORT_VERSION
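The version-string handling in this hunk can be exercised standalone. A minimal sketch with hypothetical values for `ISABELLE_VERSION` and `DOF_VERSION` (the real script now obtains them via `$ISABELLE_TOOL dof_param`):

```shell
# Hypothetical inputs; the install script reads these via `$ISABELLE_TOOL dof_param`.
ISABELLE_VERSION="Isabelle2022: October 2022"
DOF_VERSION="Unreleased"

# Strip everything from the first colon onward, as the sed call above does.
ISABELLE_SHORT_VERSION=`echo $ISABELLE_VERSION | sed -e 's/:.*$//'`
ISADOF_TAR="Isabelle_DOF-"$DOF_VERSION"_"$ISABELLE_SHORT_VERSION
echo "$ISADOF_TAR"
```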
README.md (18 changes)
@@ -10,10 +10,10 @@ online.](https://artifacts.logicalhacking.com/releases/Isabelle_DOF/Isabelle_DOF
 Isabelle/DOF has three major prerequisites:
 
 * **Isabelle:** Isabelle/DOF requires [Isabelle
-  2022](http://isabelle.in.tum.de/website-Isabelle2022/). Please download the
+  2022](https://isabelle.in.tum.de/website-Isabelle2022/). Please download the
   Isabelle 2022 distribution for your operating system from the [Isabelle
-  website](http://isabelle.in.tum.de/website-Isabelle2022/).
-* **AFP:** Isabelle/DOF requires two entries from the [Archive of Formal Proofs
+  website](https://isabelle.in.tum.de/website-Isabelle2022/).
+* **AFP:** Isabelle/DOF requires several entries from the [Archive of Formal Proofs
   (AFP)](https://www.isa-afp.org/). Please install the AFP following the
   instructions given at <https://www.isa-afp.org/using.html>.
 * **LaTeX:** Isabelle/DOF requires a modern LaTeX installation, i.e., at least
@@ -29,7 +29,7 @@ this command already during the installation of the prerequisites, you can skip
 it now):
 
 ```console
-foo@bar:~$ isabelle components -u `pwd`
+foo@bar:~$ isabelle components -u .
 ```
 
 The final step for the installation is:
@@ -91,15 +91,19 @@ templates:
 
 ```console
 foo@bar:~$ isabelle dof_mkroot -h
 
 Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]
 
 Options are:
   -I           init Mercurial repository and add generated files
   -h           print help
   -n NAME      alternative session name (default: directory base name)
-  -o ONTOLOGY  ontology (default: scholarly_paper)
-  -t TEMPLATE  tempalte (default: scrartcl)
+  -o NAMES     list of ontologies, separated by blanks
+               (default: "technical_report scholarly_paper")
+  -q           quiet mode: less verbosity
+  -t NAME      template (default: "scrreprt-modern")
 
-Prepare session root directory (default: current directory).
+Create session root directory for Isabelle/DOF (default: current directory).
 ```
 
 ## Releases
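With the revised options, creating a project with an explicit ontology list and template might look as follows (a hypothetical invocation mirroring the help text above, not taken from the README; argument spellings are assumptions):

```console
foo@bar:~$ isabelle dof_mkroot -q -o "technical_report scholarly_paper" -t "scrreprt-modern" DOF_test
```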
@@ -4,6 +4,7 @@ no_build = false
 requirements = \
   env:ISABELLE_SCALA_JAR
 sources = \
   src/scala/dof.scala \
+  src/scala/dof_document_build.scala \
   src/scala/dof_mkroot.scala \
   src/scala/dof_tools.scala
etc/options (30 changes)
@@ -1,30 +0,0 @@
-(* :mode=isabelle-options: *)
-
-section "Isabelle/DOF"
-
-public option dof_template : string = "scrreprt-modern"
-  -- "Default document template for Isabelle/DOF documents"
-
-public option dof_ontologies : string = "Isabelle_DOF.technical_report Isabelle_DOF.scholarly_paper"
-  -- "Isabelle/DOF ontologies (separated by blanks)"
-
-option dof_version : string = "Unreleased"
-  -- "Isabelle/DOF version"
-  (* "Unreleased" for development, semantic version for releases *)
-
-option dof_isabelle : string = "2022"
-option dof_afp : string = "afp-2022-10-27"
-
-option dof_latest_version : string = "1.3.0"
-option dof_latest_isabelle : string = "Isabelle2021-1"
-option dof_latest_doi : string = "10.5281/zenodo.6810799"
-option dof_generic_doi : string = "10.5281/zenodo.3370482"
-
-option dof_url : string = "https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF"
-  -- "Isabelle/DOF source repository"
-
-option dof_artifact_dir : string = "releases/Isabelle_DOF/Isabelle_DOF"
-  -- "Isabelle/DOF release artifacts"
-
-option dof_artifact_host : string = "artifacts.logicalhacking.com"
-
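The deleted `dof_ontologies` option held a blank-separated list, which shell consumers split by ordinary word splitting. A small sketch (the value is copied from the deleted option above):

```shell
# Value copied from the deleted `dof_ontologies` option.
DOF_ONTOLOGIES="Isabelle_DOF.technical_report Isabelle_DOF.scholarly_paper"

# Unquoted expansion splits on blanks, yielding one ontology per iteration.
for onto in $DOF_ONTOLOGIES; do
  echo "$onto"
done
```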
@@ -1,5 +1,4 @@
 # -*- shell-script -*- :mode=shellscript:
 
 ISABELLE_DOF_HOME="$COMPONENT"
-ISABELLE_DOCS="$ISABELLE_DOF_HOME/doc":"$ISABELLE_DOCS"
-
+ISABELLE_DOCS="$ISABELLE_DOF_HOME/doc:$ISABELLE_DOCS"
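The settings change above is purely stylistic: quoting the whole string or only its parts yields the same expansion. A quick check with hypothetical paths standing in for the real Isabelle environment:

```shell
# Hypothetical values; the real ones come from the Isabelle component setup.
ISABELLE_DOF_HOME="/opt/Isabelle_DOF"
ISABELLE_DOCS="/usr/local/isabelle/doc"

OLD_STYLE="$ISABELLE_DOF_HOME/doc":"$ISABELLE_DOCS"   # quoting each part
NEW_STYLE="$ISABELLE_DOF_HOME/doc:$ISABELLE_DOCS"     # one quoted string

[ "$OLD_STYLE" = "$NEW_STYLE" ] && echo "identical: $NEW_STYLE"
```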
@@ -1,7 +1,5 @@
 session "mini_odo" = "Isabelle_DOF" +
-  options [document = pdf, document_output = "output", document_build = dof,
-    dof_ontologies = "Isabelle_DOF.technical_report Isabelle_DOF.cenelec_50128",
-    dof_template = "Isabelle_DOF.scrreprt-modern"]
+  options [document = pdf, document_output = "output", document_build = dof]
   sessions
     "Physical_Quantities"
   theories
@@ -19,6 +19,8 @@ imports
   "Isabelle_DOF.technical_report"
   "Physical_Quantities.SI" "Physical_Quantities.SI_Pretty"
 begin
+use_template "scrreprt-modern"
+use_ontology technical_report and CENELEC_50128
 declare[[strict_monitor_checking=true]]
 define_shortcut* dof \<rightleftharpoons> \<open>\dof\<close>
                  isadof \<rightleftharpoons> \<open>\isadof{}\<close>
@@ -294,7 +296,7 @@ and the global model parameters such as wheel diameter, the number of teeth per
 sampling frequency etc., we can infer the maximal time of service as well the maximum distance
 the device can measure. As an example configuration, choosing:
 
-\<^item> \<^term>\<open>(1 *\<^sub>Q metre)::real[m]\<close> for \<^term>\<open>w\<^sub>d\<close> (wheel-diameter),
+\<^item> \<^term>\<open>(1 *\<^sub>Q metre):: real[m]\<close> for \<^term>\<open>w\<^sub>d\<close> (wheel-diameter),
 \<^item> \<^term>\<open>100 :: real\<close> for \<^term>\<open>tpw\<close> (teeth per wheel),
 \<^item> \<^term>\<open>80 *\<^sub>Q kmh :: real[m\<cdot>s\<^sup>-\<^sup>1]\<close> for \<^term>\<open>Speed\<^sub>M\<^sub>a\<^sub>x\<close>,
 \<^item> \<^term>\<open>14.4 *\<^sub>Q kHz :: real[s\<^sup>-\<^sup>1]\<close> for the sampling frequency,
@@ -654,9 +656,9 @@ text*[t10::test_result]
 test-execution via, \<^eg>, a makefile or specific calls to a test-environment or test-engine. \<close>
 
 
-text
-\<open> Finally some examples of references to doc-items, i.e. text-elements
-  with declared meta-information and status. \<close>
+text \<open> Finally some examples of references to doc-items, i.e. text-elements
+  with declared meta-information and status. \<close>
 
-text \<open> As established by @{test_result (unchecked) \<open>t10\<close>},
-@{test_result (define) \<open>t10\<close>} \<close>
+text \<open> the @{test_result \<open>t10\<close>}
@@ -16,6 +16,9 @@ theory IsaDofApplications
 imports "Isabelle_DOF.scholarly_paper"
 begin
 
+use_template "lncs"
+use_ontology "scholarly_paper"
+
 open_monitor*[this::article]
 declare[[strict_monitor_checking=false]]
@@ -71,7 +74,7 @@ abstract*[abs::abstract, keywordlist="[''Ontology'',''Ontological Modeling'',''I
 \<close>
 
 section*[intro::introduction]\<open> Introduction \<close>
-text*[introtext::introduction]\<open>
+text*[introtext::introduction, level = "Some 1"]\<open>
 The linking of the \<^emph>\<open>formal\<close> to the \<^emph>\<open>informal\<close> is perhaps the
 most pervasive challenge in the digitization of knowledge and its
 propagation. This challenge incites numerous research efforts
@@ -120,7 +123,7 @@ declare_reference*[ontomod::text_section]
 declare_reference*[ontopide::text_section]
 declare_reference*[conclusion::text_section]
 (*>*)
-text*[plan::introduction]\<open> The plan of the paper is follows: we start by introducing the underlying
+text*[plan::introduction, level="Some 1"]\<open> The plan of the paper is follows: we start by introducing the underlying
 Isabelle system (@{text_section (unchecked) \<open>bgrnd\<close>}) followed by presenting the
 essentials of \<^isadof> and its ontology language (@{text_section (unchecked) \<open>isadof\<close>}).
 It follows @{text_section (unchecked) \<open>ontomod\<close>}, where we present three application
@@ -130,7 +133,7 @@ conclusions and discuss related work in @{text_section (unchecked) \<open>conclu
 
 section*[bgrnd::text_section,main_author="Some(@{docitem ''bu''}::author)"]
 \<open> Background: The Isabelle System \<close>
-text*[background::introduction]\<open>
+text*[background::introduction, level="Some 1"]\<open>
 While Isabelle is widely perceived as an interactive theorem prover
 for HOL (Higher-order Logic)~@{cite "nipkow.ea:isabelle:2002"}, we
 would like to emphasize the view that Isabelle is far more than that:
@@ -159,7 +162,7 @@ figure*[architecture::figure,relative_width="100",src="''figures/isabelle-archit
 asynchronous communication between the Isabelle system and
 the IDE (right-hand side). \<close>
 
-text*[blug::introduction]\<open> The Isabelle system architecture shown in @{figure \<open>architecture\<close>}
+text*[blug::introduction, level="Some 1"]\<open> The Isabelle system architecture shown in @{figure \<open>architecture\<close>}
 comes with many layers, with Standard ML (SML) at the bottom layer as implementation
 language. The architecture actually foresees a \<^emph>\<open>Nano-Kernel\<close> (our terminology) which
 resides in the SML structure \<^ML_structure>\<open>Context\<close>. This structure provides a kind of container called
@@ -191,7 +194,7 @@ For the antiquotation \inlineisar+\at{value "fac 5"}+ we assume the usual defin
 \inlineisar+fac+ in HOL.
 \<close>
 
-text*[anti]\<open> Thus, antiquotations can refer to formal content, can be type-checked before being
+text*[anti::introduction, level = "Some 1"]\<open> Thus, antiquotations can refer to formal content, can be type-checked before being
 displayed and can be used for calculations before actually being typeset. When editing,
 Isabelle's PIDE offers auto-completion and error-messages while typing the above
 \<^emph>\<open>semi-formal\<close> content. \<close>
@@ -1,7 +1,5 @@
 session "2018-cicm-isabelle_dof-applications" = "Isabelle_DOF" +
-  options [document = pdf, document_output = "output", document_build = dof,
-    dof_ontologies = "Isabelle_DOF.scholarly_paper", dof_template = Isabelle_DOF.lncs,
-    quick_and_dirty = true]
+  options [document = pdf, document_output = "output", document_build = dof, quick_and_dirty = true]
   theories
     IsaDofApplications
   document_files
@@ -1,6 +1,5 @@
 session "2020-iFM-csp" = "Isabelle_DOF" +
-  options [document = pdf, document_output = "output", document_build = dof,
-    dof_ontologies = "Isabelle_DOF.scholarly_paper", dof_template = Isabelle_DOF.scrartcl]
+  options [document = pdf, document_output = "output", document_build = dof]
   theories
     "paper"
   document_files
@@ -6870,7 +6870,7 @@ isbn="978-3-540-48509-4"
   title = {{Isabelle's} Logic: {HOL}},
   author = {Tobias Nipkow and Lawrence C. Paulson and Markus Wenzel},
   year = 2009,
-  misc = {\url{http://isabelle.in.tum.de/library/HOL/}}
+  misc = {\url{https://isabelle.in.tum.de/library/HOL/}}
 }
 
 @InProceedings{ garson.ea:security:2008,
@@ -11000,7 +11000,7 @@ isbn="978-1-4471-3182-3"
   journal = {Archive of Formal Proofs},
   month = apr,
   year = 2019,
-  note = {\url{http://isa-afp.org/entries/HOL-CSP.html}},
+  note = {\url{https://isa-afp.org/entries/HOL-CSP.html}},
   ISSN = {2150-914x},
 }
@@ -3,6 +3,8 @@ theory "paper"
 imports "Isabelle_DOF.scholarly_paper"
 begin
 
+use_template "scrartcl"
+use_ontology "scholarly_paper"
 
 open_monitor*[this::article]
 
@@ -50,7 +52,7 @@ abstract*[abs, keywordlist="[\<open>Shallow Embedding\<close>,\<open>Process-Alg
 \<close>
 text\<open>\<close>
 section*[introheader::introduction,main_author="Some(@{docitem ''bu''}::author)"]\<open> Introduction \<close>
-text*[introtext::introduction]\<open>
+text*[introtext::introduction, level="Some 1"]\<open>
 Communicating Sequential Processes (\<^csp>) is a language
 to specify and verify patterns of interaction of concurrent systems.
 Together with CCS and LOTOS, it belongs to the family of \<^emph>\<open>process algebras\<close>.
@@ -152,7 +154,7 @@ processes \<open>Skip\<close> (successful termination) and \<open>Stop\<close> (
 \<open>\<T>(Skip) = \<T>(Stop) = {[]}\<close>.
 Note that the trace sets, representing all \<^emph>\<open>partial\<close> history, is in general prefix closed.\<close>
 
-text*[ex1::math_example, status=semiformal] \<open>
+text*[ex1::math_example, status=semiformal, level="Some 1"] \<open>
 Let two processes be defined as follows:
 
 \<^enum> \<open>P\<^sub>d\<^sub>e\<^sub>t = (a \<rightarrow> Stop) \<box> (b \<rightarrow> Stop)\<close>
@@ -352,7 +354,7 @@ Roscoe and Brooks @{cite "Roscoe1992AnAO"} finally proposed another ordering, ca
 that completeness could at least be assured for read-operations. This more complex ordering
 is based on the concept \<^emph>\<open>refusals after\<close> a trace \<open>s\<close> and defined by \<open>\<R> P s \<equiv> {X | (s, X) \<in> \<F> P}\<close>.\<close>
 
-Definition*[process_ordering, short_name="''process ordering''"]\<open>
+Definition*[process_ordering, level= "Some 2", short_name="''process ordering''"]\<open>
 We define \<open>P \<sqsubseteq> Q \<equiv> \<psi>\<^sub>\<D> \<and> \<psi>\<^sub>\<R> \<and> \<psi>\<^sub>\<M> \<close>, where
 \<^enum> \<open>\<psi>\<^sub>\<D> = \<D> P \<supseteq> \<D> Q \<close>
 \<^enum> \<open>\<psi>\<^sub>\<R> = s \<notin> \<D> P \<Rightarrow> \<R> P s = \<R> Q s\<close>
@@ -528,10 +530,10 @@ To handle termination better, we added two new processes \<open>CHAOS\<^sub>S\<^
 \<close>
 
 (*<*) (* a test ...*)
-text*[X22 ::math_content ]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
-text*[X32::"definition", mcc=defn]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
-Definition*[X42]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
-Definition*[X52::"definition"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
+text*[X22 ::math_content, level="Some 2" ]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
+text*[X32::"definition", level="Some 2", mcc=defn]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
+Definition*[X42, level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
+Definition*[X52::"definition", level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
 
 text\<open> The \<open>RUN\<close>-process defined @{math_content X22} represents the process that accepts all
 events, but never stops nor deadlocks. The \<open>CHAOS\<close>-process comes in two variants shown in
@@ -539,11 +541,11 @@ events, but never stops nor deadlocks. The \<open>CHAOS\<close>-process comes in
 stops or accepts any offered event, whereas \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P\<close> can additionally terminate.\<close>
 (*>*)
 
-Definition*[X2]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
-Definition*[X3]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
-Definition*[X4]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close>\<close>
-Definition*[X5]\<open>\<open>DF A \<equiv> \<mu> X. (\<sqinter> x \<in> A \<rightarrow> X)\<close> \<close>
-Definition*[X6]\<open>\<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. ((\<sqinter> x \<in> A \<rightarrow> X) \<sqinter> SKIP)\<close> \<close>
+Definition*[X2, level="Some 2"]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
+Definition*[X3, level="Some 2"]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
+Definition*[X4, level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close>\<close>
+Definition*[X5, level="Some 2"]\<open>\<open>DF A \<equiv> \<mu> X. (\<sqinter> x \<in> A \<rightarrow> X)\<close> \<close>
+Definition*[X6, level="Some 2"]\<open>\<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. ((\<sqinter> x \<in> A \<rightarrow> X) \<sqinter> SKIP)\<close> \<close>
 
 text\<open>In the following, we denote \<open> \<R>\<P> = {DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P, DF, RUN, CHAOS, CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P}\<close>.
 All five reference processes are divergence-free.
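In conventional notation, the five reference processes defined in this hunk (transcribed from the diff content; \(\Box\) is external and \(\sqcap\) internal choice) read:

```latex
\begin{align*}
  \mathit{RUN}\,A &\equiv \mu X.\; \Box\, x \in A \rightarrow X\\
  \mathit{CHAOS}\,A &\equiv \mu X.\; (\mathit{STOP} \sqcap (\Box\, x \in A \rightarrow X))\\
  \mathit{CHAOS_{SKIP}}\,A &\equiv \mu X.\; (\mathit{SKIP} \sqcap \mathit{STOP} \sqcap (\Box\, x \in A \rightarrow X))\\
  \mathit{DF}\,A &\equiv \mu X.\; (\sqcap\, x \in A \rightarrow X)\\
  \mathit{DF_{SKIP}}\,A &\equiv \mu X.\; ((\sqcap\, x \in A \rightarrow X) \sqcap \mathit{SKIP})
\end{align*}
```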
@@ -605,16 +607,16 @@ handled separately. One contribution of our work is establish their precise rela
 the Failure/Divergence Semantics of \<^csp>.\<close>
 
 (* bizarre: Definition* does not work for this single case *)
-text*[X10::"definition"]\<open> \<open>deadlock\<^sub>-free P \<equiv> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F> P\<close> \<close>
+text*[X10::"definition", level="Some 2"]\<open> \<open>deadlock\<^sub>-free P \<equiv> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F> P\<close> \<close>
 
 text\<open>\<^noindent> A process \<open>P\<close> is deadlock-free if and only if after any trace \<open>s\<close> without \<open>\<surd>\<close>, the union of \<open>\<surd>\<close>
 and all events of \<open>P\<close> can never be a refusal set associated to \<open>s\<close>, which means that \<open>P\<close> cannot
 be deadlocked after any non-terminating trace.
 \<close>
 
-Theorem*[T1, short_name="\<open>DF definition captures deadlock-freeness\<close>"]
+Theorem*[T1, short_name="\<open>DF definition captures deadlock-freeness\<close>", level="Some 2"]
 \<open> \hfill \break \<open>deadlock_free P \<longleftrightarrow> (\<forall>s\<in>\<T> P. tickFree s \<longrightarrow> (s, {\<surd>}\<union>events_of P) \<notin> \<F> P)\<close> \<close>
-Definition*[X11]\<open> \<open>livelock\<^sub>-free P \<equiv> \<D> P = {} \<close> \<close>
+Definition*[X11, level="Some 2"]\<open> \<open>livelock\<^sub>-free P \<equiv> \<D> P = {} \<close> \<close>
 
 text\<open> Recall that all five reference processes are livelock-free.
 We also have the following lemmas about the
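Restating the characterizations from this hunk in plain notation (a direct transcription of the diff content, no new claims):

```latex
\begin{align*}
  \mathrm{deadlock\_free}\;P \;&\equiv\; \mathit{DF_{SKIP}}\,\mathit{UNIV} \sqsubseteq_{\mathcal{F}} P\\
  \mathrm{deadlock\_free}\;P \;&\longleftrightarrow\;
    \forall s \in \mathcal{T}(P).\; \mathrm{tickFree}\,s \rightarrow
    (s,\{\checkmark\} \cup \mathrm{events\_of}(P)) \notin \mathcal{F}(P)\\
  \mathrm{livelock\_free}\;P \;&\equiv\; \mathcal{D}(P) = \emptyset
\end{align*}
```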
@@ -628,7 +630,7 @@ text\<open>
 Finally, we proved the following theorem that confirms the relationship between the two vital
 properties:
 \<close>
-Theorem*[T2, short_name="''DF implies LF''"]
+Theorem*[T2, short_name="''DF implies LF''", level="Some 2"]
 \<open> \<open>deadlock_free P \<longrightarrow> livelock_free P\<close> \<close>
 
 text\<open>
@@ -795,7 +797,7 @@ This normal form is closed under deterministic and communication operators.
 The advantage of this format is that we can mimick the well-known product automata construction
 for an arbitrary number of synchronized processes under normal form.
 We only show the case of the synchronous product of two processes: \<close>
-text*[T3::"theorem", short_name="\<open>Product Construction\<close>"]\<open>
+text*[T3::"theorem", short_name="\<open>Product Construction\<close>", level="Some 2"]\<open>
 Parallel composition translates to normal form:
 @{cartouche [display,indent=5]\<open>(P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>\<^sub>1,\<upsilon>\<^sub>1\<rbrakk> \<sigma>\<^sub>1) || (P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>\<^sub>2,\<upsilon>\<^sub>2\<rbrakk> \<sigma>\<^sub>2) =
 P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<lambda>(\<sigma>\<^sub>1,\<sigma>\<^sub>2). \<tau>\<^sub>1 \<sigma>\<^sub>1 \<inter> \<tau>\<^sub>2 \<sigma>\<^sub>2 , \<lambda>(\<sigma>\<^sub>1,\<sigma>\<^sub>2).\<lambda>e.(\<upsilon>\<^sub>1 \<sigma>\<^sub>1 e, \<upsilon>\<^sub>2 \<sigma>\<^sub>2 e)\<rbrakk> (\<sigma>\<^sub>1,\<sigma>\<^sub>2)\<close>}
@@ -815,7 +817,7 @@ states via the closure \<open>\<RR>\<close>, which is defined inductively over:
 Thus, normalization leads to a new characterization of deadlock-freeness inspired
 from automata theory. We formally proved the following theorem:\<close>
 
-text*[T4::"theorem", short_name="\<open>DF vs. Reacheability\<close>"]
+text*[T4::"theorem", short_name="\<open>DF vs. Reacheability\<close>", level="Some 2"]
 \<open> If each reachable state \<open>s \<in> (\<RR> \<tau> \<upsilon>)\<close> has outgoing transitions,
 the \<^csp> process is deadlock-free:
 @{cartouche [display,indent=10] \<open>\<forall>\<sigma> \<in> (\<RR> \<tau> \<upsilon> \<sigma>\<^sub>0). \<tau> \<sigma> \<noteq> {} \<Longrightarrow> deadlock_free (P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>,\<upsilon>\<rbrakk> \<sigma>\<^sub>0)\<close>}
@@ -65,7 +65,7 @@ text\<open>
 \<close>
 
 (*<*)
-declare_reference*["fig:dependency"::text_section]
+declare_reference*["fig:dependency"::figure]
 (*>*)
@@ -159,9 +159,6 @@ replaced by built-in document templates.\<close> for users are:
 \<^LaTeX>-styles) need to be registered in this file. For details of Isabelle's build system, please
 consult the Isabelle System Manual~@{cite "wenzel:system-manual:2020"}. In addition to the
 standard features of, this file also contains \<^isadof> specific configurations:
-\<^item> \<^boxed_bash>\<open>dof_ontologies\<close> a list of (fully qualified) ontologies, separated by spaces, used
-  by the project.
-\<^item> \<^boxed_bash>\<open>dof_template\<close> the (fully qualified) document template.
 \<^item> \<^boxed_bash>\<open>document_build=dof\<close> needs to be present, to tell Isabelle, to use the
   Isabelle/DOF backend for the document generation.
 \<^item> The file \<^boxed_bash>\<open>preamble.tex\<close>\<^index>\<open>preamble.tex\<close>, which allows users to add additional
@@ -186,12 +183,14 @@ Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]
 
 Options are:
   -I           init Mercurial repository and add generated files
   -h           print help
   -n NAME      alternative session name (default: directory base name)
-  -o ONTOLOGY  ontology (default: scholarly_paper)
-  -t TEMPLATE  tempalte (default: scrartcl)
+  -o NAMES     list of ontologies, separated by blanks
+               (default: "technical_report scholarly_paper")
+  -q           quiet mode: less verbosity
+  -t NAME      template (default: "scrreprt-modern")
 
-Prepare session root directory (default: current directory).\<close>}
+Create session root directory for Isabelle/DOF (default: current directory).\<close>}
 \<close>
 
 section*[writing_doc::technical]\<open>Writing Documents: General Remarks\<close>
@@ -693,8 +692,29 @@ text\<open>There are options to display sub-parts of formulas etc., but it is a
 of tight-checking that the information must be given complete and exactly in the syntax of
 Isabelle. This may be over-precise and a burden to readers not familiar with Isabelle, which may
 motivate authors to choose the aforementioned freeform-style.
+
+Additionally, documents antiquotations were added to check and evaluate terms with
+term antiquotations:
+\<^item> \<^theory_text>\<open>@{term_ \<open>term\<close> }\<close> parses and type-checks \<open>term\<close> with term antiquotations,
+  for instance \<^theory_text>\<open>@{term_ \<open>@{cenelec-term \<open>FT\<close>}\<close>}\<close> will parse and check
+  that \<open>FT\<close> is indeed an instance of the class \<^typ>\<open>cenelec_term\<close>,
+\<^item> \<^theory_text>\<open>@{value_ \<open>term\<close> }\<close> performs the evaluation of \<open>term\<close> with term antiquotations,
+  for instance \<^theory_text>\<open>@{value_ \<open>mcc @{cenelec-term \<open>FT\<close>}\<close>}\<close>
+  will print the value of the \<^const>\<open>mcc\<close> attribute of the instance \<open>FT\<close>.
+\<^theory_text>\<open>value_\<close> may have an optional argument between square brackets to specify the evaluator but this
+argument must be specified after a default optional argument already defined
+by the text antiquotation implementation.
+So one must use the following syntax if he does not want to specify the first optional argument:
+\<open>@{value_ [] [nbe] \<open>r @{F \<open>xcv4\<close>}\<close>}\<close>. Note the empty brackets.
 \<close>
+
+(*<*)
+declare_reference*["subsec:onto-term-ctxt"::technical]
+(*>*)
+
+text\<open>They are text-contexts equivalents to the \<^theory_text>\<open>term*\<close> and \<^theory_text>\<open>value*\<close> commands
+for term-contexts introduced in @{technical (unchecked) \<open>subsec:onto-term-ctxt\<close>}\<close>
 
 subsection\<open>A Technical Report with Tight Checking\<close>
 text\<open>An example of tight checking is a small programming manual developed by the
 second author in order to document programming trick discoveries while implementing in Isabelle.
|
|||
We call document classes with an \<open>accepts_clause\<close>
|
||||
\<^emph>\<open>monitor classes\<close> or \<^emph>\<open>monitors\<close> for short.
|
||||
\<^rail>\<open> (@@{command "doc_class"}| @@{command "onto_class"}) class_id '=' (class_id '+')? (attribute_decl+) \<newline>
|
||||
(invariant_decl *)
|
||||
(accepts_clause rejects_clause?)?\<close>
|
||||
(invariant_decl *) (rejects_clause accepts_clause)? \<newline> (accepts_clause *)\<close>
|
||||
\<^item> \<open>attribute_decl\<close>:\<^index>\<open>attribute\_decl@\<open>attribute_decl\<close>\<close>
|
||||
\<^rail>\<open> name '::' '"' type '"' default_clause? \<close>
|
||||
\<^item> \<open>invariant_decl\<close>:\<^index>\<open>invariant\_decl@\<open>invariant_decl\<close>\<close>
|
||||
|
@@ -245,10 +244,10 @@ A document class\<^bindex>\<open>document class\<close> can be defined using the
   disambiguate the argument of the expression
   and the \<^boxed_text>\<open>\<sigma>\<close> symbol is reserved to reference the instance of the class itself.
   \<^rail>\<open> 'invariant' (name '::' '"' term '"' + 'and') \<close>
-\<^item> \<open>accepts_clause\<close>:\<^index>\<open>accepts\_clause@\<open>accepts_clause\<close>\<close>
-  \<^rail>\<open> 'accepts' '"' regexpr '"'\<close>
 \<^item> \<open>rejects_clause\<close>:\<^index>\<open>rejects\_clause@\<open>rejects_clause\<close>\<close>
   \<^rail>\<open> 'rejects' (class_id * ',') \<close>
+\<^item> \<open>accepts_clause\<close>:\<^index>\<open>accepts\_clause@\<open>accepts_clause\<close>\<close>
+  \<^rail>\<open> 'accepts' ('"' regexpr '"' + 'and')\<close>
 \<^item> \<open>default_clause\<close>:\<^index>\<open>default\_clause@\<open>default_clause\<close>\<close>
   \<^rail>\<open> '<=' '"' expr '"' \<close>
 \<^item> \<open>regexpr\<close>:\<^index>\<open>regexpr@\<open>regexpr\<close>\<close>
@@ -335,10 +334,10 @@ is currently only available in the SML API's of the kernel.
\<^item> \<open>annotated_text_element\<close> :
\<^rail>\<open>
( @@{command "text*"} '[' meta_args ']' '\<open>' formal_text '\<close>'
| @@{command "ML*"} ('[' meta_args ']')? '\<open>' SML_code '\<close>'
| @@{command "ML*"} '[' meta_args ']' '\<open>' SML_code '\<close>'
| @@{command "term*"} ('[' meta_args ']')? '\<open>' HOL_term '\<close>'
| @@{command "value*"} \<newline> ('[' meta_args ']' ('[' evaluator ']')?)? '\<open>' HOL_term '\<close>'
| @@{command "assert*"} \<newline> ('[' meta_args ']' ('[' evaluator ']')?)? '\<open>' HOL_term '\<close>'
| (@@{command "value*"}
| @@{command "assert*"}) \<newline> ('[' meta_args ']')? ('[' evaluator ']')? '\<open>' HOL_term '\<close>'
)
\<close>
\<^rail>\<open>
@@ -404,13 +403,24 @@ a referentiable meta-object of class \<^theory_text>\<open>cid\<close> is create
The SML-code is type-checked and executed
in the context of the SML toplevel of the Isabelle system as in the corresponding
\<^theory_text>\<open>ML\<open> \<dots> SML-code \<dots> \<close>\<close>-command.
Additionally, ML antiquotations have been added to check and evaluate terms with
term antiquotations:
\<^item> \<^theory_text>\<open>@{term_ \<open>term\<close> }\<close> parses and type-checks \<open>term\<close> with term antiquotations,
for instance \<^theory_text>\<open>@{term_ \<open>@{cenelec-term \<open>FT\<close>}\<close>}\<close> will parse and check
that \<open>FT\<close> is indeed an instance of the class \<^typ>\<open>cenelec_term\<close>,
\<^item> \<^theory_text>\<open>@{value_ \<open>term\<close> }\<close> performs the evaluation of \<open>term\<close> with term antiquotations,
for instance \<^theory_text>\<open>@{value_ \<open>mcc @{cenelec-term \<open>FT\<close>}\<close>}\<close>
will print the value of the \<^const>\<open>mcc\<close> attribute of the instance \<open>FT\<close>.
\<^theory_text>\<open>value_\<close> may have an optional argument between square brackets to specify the evaluator:
\<open>@{value_ [nbe] \<open>mcc @{cenelec-term \<open>FT\<close>}\<close>}\<close> forces \<open>value_\<close> to evaluate
the term using normalization by evaluation (see @{cite "wenzel:isabelle-isar:2020"}).
\<close>
(*<*)
declare_reference*["text-elements-expls"::technical]
(*>*)

subsection\<open>Ontological Term-Contexts and their Management\<close>
subsection*["subsec:onto-term-ctxt"::technical]\<open>Ontological Term-Contexts and their Management\<close>
text\<open>The major commands providing term-contexts are
\<^theory_text>\<open>term*[oid::cid, ...] \<open> \<dots> HOL-term \<dots> \<close>\<close>,
\<^theory_text>\<open>value*[oid::cid, ...] \<open> \<dots> HOL-term \<dots> \<close>\<close> and
@@ -418,8 +428,9 @@ text\<open>The major commands providing term-contexts are
Wrt. creation, track-ability and checking they are analogous to the ontological text and
code-commands. However, the argument terms may contain term-antiquotations stemming from an
ontology definition. Both term-contexts are type-checked and \<^emph>\<open>validated\<close> against
the global context (so: in the term \<open>@{A \<open>oid\<close>}\<close>, \<open>oid\<close> is indeed a string which refers
to a meta-object belonging to the document class \<open>A\<close>, for example).
the global context (so: in the term @{term_ \<open>@{author \<open>bu\<close>}\<close>}, \<open>bu\<close>
is indeed a string which refers to a meta-object belonging
to the document class \<^typ>\<open>author\<close>, for example).
The term-context in the \<open>value*\<close>-command and \<^emph>\<open>assert*\<close>-command is additionally expanded
(\<^eg> replaced) by a term denoting the meta-object.
This expansion happens \<^emph>\<open>before\<close> evaluation of the term, thus permitting
@@ -434,7 +445,6 @@ Note unspecified attribute values were represented by free fresh variables which
to choose either the normalization-by-evaluation strategy \<^theory_text>\<open>nbe\<close> or a proof attempt via
the \<^theory_text>\<open>auto\<close> method. A failure of these strategies will be reported and regarded as non-validation
of this meta-object. The latter leads to a failure of the entire command.
\<close>
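text\<open>As a sketch of such a validation (the class \<open>author\<close> stems from the scholarly-paper
ontology used elsewhere in this manual; the attribute name \<open>email\<close> and its value are
assumptions chosen purely for illustration), an \<^theory_text>\<open>assert*\<close>-command could read:
@{boxed_theory_text [display]\<open>
assert*\<open>email @{author \<open>bu\<close>} = ''wolff@lri.fr''\<close>\<close>}
If the evaluation of this equality fails, the \<^theory_text>\<open>assert*\<close>-command itself fails.\<close>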
(*<*)
@@ -448,7 +458,16 @@ text\<open>\<^isadof> provides a number of inspection commands.
\<^item> \<^ML>\<open>DOF_core.print_doc_class_tree\<close> allows for presenting (fragments) of
class-inheritance trees (currently only available at ML level),
\<^item> \<^theory_text>\<open>print_doc_items\<close> allows viewing the status of the internal
object-table of text-elements that were tracked, and
object-table of text-elements that were tracked.
The theory attribute \<^theory_text>\<open>object_value_debug\<close> allows inspecting
the value term of instances before its elaboration and normalization.
Adding:
@{boxed_theory_text [display]\<open>
declare[[object_value_debug = true]]\<close>}
... to the theory
will add the raw value term to the instance object-table for all the subsequent
declared instances (using \<^theory_text>\<open>text*\<close> for instance).
The raw term will be available in the \<open>input_term\<close> field of \<^theory_text>\<open>print_doc_items\<close> output and,
\<^item> \<^theory_text>\<open>check_doc_global\<close> checks if all declared object references have been
defined, all monitors are in a final state, and checks the final invariant
on all objects (cf. @{technical (unchecked) \<open>sec:advanced\<close>})
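A typical place for this check, following the pattern used at the end of this manual itself,
is the very end of a theory, after closing the top-level monitor (a sketch; the monitor
name \<open>this\<close> is an assumption taken from the surrounding ontology setup):
@{boxed_theory_text [display]\<open>
close_monitor*[this]
check_doc_global\<close>}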
@@ -1040,11 +1059,19 @@ subsection*["sec:class_inv"::technical]\<open>ODL Class Invariants\<close>
text\<open>
Ontological classes as described so far are too liberal in many situations.
There is a first high-level syntax implementation for class invariants.
These invariants can be checked when an instance of the class is defined.
To enable the checking of the invariants, the \<^boxed_theory_text>\<open>invariants_checking\<close>
These invariants are checked when an instance of the class is defined,
and trigger warnings.
The checking is enabled by default but can be disabled with the
\<^boxed_theory_text>\<open>invariants_checking\<close> theory attribute:
@{boxed_theory_text [display]\<open>
declare[[invariants_checking = false]]\<close>}

To enable the strict checking of the invariants,
that is to trigger errors instead of warnings,
the \<^boxed_theory_text>\<open>invariants_strict_checking\<close>
theory attribute must be set:
@{boxed_theory_text [display]\<open>
declare[[invariants_checking = true]]\<close>}
declare[[invariants_strict_checking = true]]\<close>}

For example, let's define the following two classes:
@{boxed_theory_text [display]\<open>
@@ -1104,8 +1131,15 @@ text\<open>
All specified constraints are already checked in the IDE of \<^dof> while editing.
The invariant \<^boxed_theory_text>\<open>author_finite\<close> enforces that the user sets the
\<^boxed_theory_text>\<open>authored_by\<close> set.
The invariants \<^theory_text>\<open>author_finite\<close> and \<^theory_text>\<open>establish_defined\<close> cannot be checked directly
and need a little help.
We can set the \<open>invariants_checking_with_tactics\<close> theory attribute to help the checking.
It will enable a basic tactic, using unfold and auto:
@{boxed_theory_text [display]\<open>
declare[[invariants_checking_with_tactics = true]]\<close>}
There are still some limitations with this high-level syntax.
For now, the high-level syntax does not support monitors (see \<^technical>\<open>sec:monitors\<close>).
For now, the high-level syntax does not support the checking of
specific monitor behaviors (see \<^technical>\<open>sec:monitors\<close>).
For example, one would like to delay a final error message until the
closing of a monitor.
For this use-case you can use low-level class invariants (see \<^technical>\<open>sec:low_level_inv\<close>).
@@ -1147,6 +1181,20 @@ text\<open>
instances of \<open>S\<close>.
\<close>
text\<open>
Should the specified ranges of admissible instances not be observed, warnings will be triggered.
To forbid the violation of the specified ranges,
one can enable the \<^boxed_theory_text>\<open>strict_monitor_checking\<close> theory attribute:
@{boxed_theory_text [display]\<open>declare[[strict_monitor_checking = true]]\<close>}
It is possible to enable the tracing of free classes occurring inside the scope of a monitor by
enabling the \<^boxed_theory_text>\<open>free_class_in_monitor_checking\<close>
theory attribute:
@{boxed_theory_text [display]\<open>declare[[free_class_in_monitor_checking = true]]\<close>}
Then a warning will be triggered when defining an instance of a free class
inside the scope of a monitor.
To forbid free classes inside the scope of a monitor, one can enable the
\<^boxed_theory_text>\<open>free_class_in_monitor_strict_checking\<close> theory attribute:
@{boxed_theory_text [display]\<open>declare[[free_class_in_monitor_strict_checking = true]]\<close>}

Monitored document sections can be nested and overlap; thus, it is
possible to combine the effect of different monitors. For example, it
would be possible to refine the \<^boxed_theory_text>\<open>example\<close> section by its own
@@ -1164,8 +1212,17 @@ text\<open>
header delimiting the borders of its representation. Class invariants
on monitors allow for specifying structural properties on document
sections.
For now, the high-level syntax of invariants is not supported for monitors and you must use
the low-level class invariants (see \<^technical>\<open>sec:low_level_inv\<close>.\<close>
For now, the high-level syntax of invariants does not support the checking of
specific monitor behaviors like the one just described and you must use
the low-level class invariants (see \<^technical>\<open>sec:low_level_inv\<close>).

Low-level invariants checking can be set up to be triggered
when opening a monitor, when closing a monitor, or both
by using the \<^ML>\<open>DOF_core.update_class_eager_invariant\<close>,
\<^ML>\<open>DOF_core.update_class_lazy_invariant\<close>, or \<^ML>\<open>DOF_core.update_class_invariant\<close> commands
respectively, to add the invariants to the theory context
(See \<^technical>\<open>sec:low_level_inv\<close> for an example).
\<close>
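text\<open>As a sketch (the monitor class name \<open>report\<close> and the invariant function
\<open>check_report_inv\<close> are assumptions for illustration, and the exact ML signature may differ),
registering a lazy invariant that is only checked when the monitor is closed could look like:
@{boxed_theory_text [display]\<open>
setup\<open>DOF_core.update_class_lazy_invariant "report" check_report_inv\<close>\<close>}
\<close>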
subsection*["sec:low_level_inv"::technical]\<open>ODL Low-level Class Invariants\<close>
@@ -1182,9 +1239,9 @@ text\<open>
fun check_result_inv oid {is_monitor:bool} ctxt =
  let
    val kind =
      AttributeAccess.compute_attr_access ctxt "evidence" oid <@>{here} <@>{here}
      ISA_core.compute_attr_access ctxt "evidence" oid NONE <@>{here}
    val prop =
      AttributeAccess.compute_attr_access ctxt "property" oid <@>{here} <@>{here}
      ISA_core.compute_attr_access ctxt "property" oid NONE <@>{here}
    val tS = HOLogic.dest_list prop
  in case kind of
       <@>{term "proof"} => if not(null tS) then true
@@ -1233,7 +1290,8 @@ value*\<open>filter (\<lambda>\<sigma>. the (text_section.level \<sigma>) > 1)

section*[infrastructure::technical]\<open>Technical Infrastructure\<close>

text\<open>
text\<open>
%FIXME update story concerning "list"
The list of fully supported (\<^ie>, supporting both interactive ontological modeling and
document generation) ontologies and the list of supported document templates can be
obtained by calling \<^boxed_bash>\<open>isabelle dof_mkroot -h\<close> (see \<^technical>\<open>first_project\<close>).
@@ -1320,6 +1378,7 @@ text\<open>
text\<open>
Developing a new document template ``\<^boxed_bash>\<open>bar\<close>'' requires the following steps:
\<^item> develop a new \<^LaTeX>-template \<^boxed_bash>\<open>src/document-templates/root-bar.tex\<close>
\<^item> add a suitable \<^theory_text>\<open>define_template\<close> command to theory \<^theory>\<open>Isabelle_DOF.Isa_DOF\<close>.
\<^item> activate the new document template by executing the \<^boxed_bash>\<open>install\<close> script. You can skip the lengthy
checks for the AFP entries and the installation of the Isabelle patch by using the
\<^boxed_bash>\<open>--skip-afp\<close> option:
@@ -53,7 +53,6 @@ text\<open>
\<open>structure Data = Generic_Data
( type T = docobj_tab * docclass_tab * ...
  val empty = (initial_docobj_tab, initial_docclass_tab, ...)
  val extend = I
  fun merge((d1,c1,...),(d2,c2,...)) = (merge_docobj_tab (d1,d2,...),
                                        merge_docclass_tab(c1,c2,...))
);\<close>}
@@ -15,6 +15,8 @@
theory "Isabelle_DOF-Manual"
imports "05_Implementation"
begin
use_template "scrreprt-modern"
use_ontology "technical_report" and "CENELEC_50128"
close_monitor*[this]
check_doc_global
text\<open>Resulting trace in \<^verbatim>\<open>doc_item\<close> ''this'': \<close>
@@ -1,7 +1,5 @@
session "Isabelle_DOF-Manual" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof,
dof_ontologies = "Isabelle_DOF.technical_report Isabelle_DOF.cenelec_50128", dof_template = "Isabelle_DOF.scrreprt-modern",
quick_and_dirty = true]
options [document = pdf, document_output = "output", document_build = dof, quick_and_dirty = true]
theories
"Isabelle_DOF-Manual"
document_files
@@ -451,7 +451,7 @@
journal = {Archive of Formal Proofs},
month = may,
year = 2010,
note = {\url{http://isa-afp.org/entries/Regular-Sets.html}, Formal
note = {\url{https://isa-afp.org/entries/Regular-Sets.html}, Formal
proof development},
issn = {2150-914x}
}
@@ -462,7 +462,7 @@
journal = {Archive of Formal Proofs},
month = mar,
year = 2004,
note = {\url{http://isa-afp.org/entries/Functional-Automata.html},
note = {\url{https://isa-afp.org/entries/Functional-Automata.html},
Formal proof development},
issn = {2150-914x}
}
@@ -1,7 +1,5 @@
session "TR_MyCommentedIsabelle" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof,
dof_ontologies = "Isabelle_DOF.technical_report", dof_template = Isabelle_DOF.scrreprt,
quick_and_dirty = true]
options [document = pdf, document_output = "output", document_build = dof, quick_and_dirty = true]
theories
"TR_MyCommentedIsabelle"
document_files
@@ -14,9 +14,11 @@
(*<*)
theory TR_MyCommentedIsabelle
imports "Isabelle_DOF.technical_report"

begin

use_template "scrreprt"
use_ontology "technical_report"

define_shortcut* isabelle \<rightleftharpoons> \<open>Isabelle/HOL\<close>

open_monitor*[this::report]
@@ -62,7 +64,7 @@ text\<open> \<^vs>\<open>-0.5cm\<close>
Isabelle and Isabelle/HOL, a complementary text to the unfortunately somewhat outdated
"The Isabelle Cookbook" in \<^url>\<open>https://nms.kcl.ac.uk/christian.urban/Cookbook/\<close>.
The present text is also complementary to the current version of
\<^url>\<open>https://isabelle.in.tum.de/dist/Isabelle2021/doc/isar-ref.pdf\<close>
\<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>
"The Isabelle/Isar Implementation" by Makarius Wenzel in that it focusses on subjects
not covered there, or presents alternative explanations for which I believe, based on my
experiences with students and PhDs, that they are helpful.
@@ -364,7 +366,7 @@ subsection*[t213::example]\<open>Mechanism 2 : Extending the Global Context \<open>

text\<open>A central mechanism for constructing user-defined data is the \<^ML_functor>\<open>Generic_Data\<close>-functor.
A plugin needing some data \<^verbatim>\<open>T\<close> and providing it with implementations for an
\<^verbatim>\<open>empty\<close>, and operations \<^verbatim>\<open>merge\<close> and \<^verbatim>\<open>extend\<close>, can construct a lens with operations
\<^verbatim>\<open>empty\<close>, and operation \<^verbatim>\<open>merge\<close>, can construct a lens with operations
\<^verbatim>\<open>get\<close> and \<^verbatim>\<open>put\<close> that attach this data into the generic system context. Rather than using
unsynchronized SML mutable variables, this is the mechanism to introduce component local
data in Isabelle, which allows managing this data for the necessary backtrack and synchronization
@@ -373,14 +375,12 @@ text\<open>A central mechanism for constructing user-defined data is by the \<^M
ML \<open>
datatype X = mt
val init = mt;
val ext = I
fun merge (X,Y) = mt

structure Data = Generic_Data
(
  type T = X
  val empty = init
  val extend = ext
  val merge = merge
);
\<close>
@@ -151,19 +151,19 @@ fi

ACTUAL_ISABELLE_VERSION=`$ISABELLE_TOOL version`

ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL options -g dof_isabelle)"
ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL dof_param -b isabelle_version)"

if [ ${ISABELLE_VERSION} = "Isabelle" ];
then
  echo "Error: cannot find Isabelle/DOF configuration, please check that you"
  echo " registered Isabelle/DOF as an Isabelle component, e.g., using"
  echo " isabelle components -u $PWD"
  echo " isabelle components -u ."
  exit 1
fi


AFP_DATE="$($ISABELLE_TOOL options -g dof_afp)"
AFP_DATE="$($ISABELLE_TOOL dof_param -b afp_version)"
AFP_URL="https://www.isa-afp.org/release/"$AFP_DATE".tar.gz"

echo ""
@@ -125,14 +125,14 @@ fun heading_command (name, pos) descr level =
Monitor_Command_Parser.document_command (name, pos) descr
  {markdown = false, body = true} (enriched_text_element_cmd level);

val _ = heading_command ("title*", @{here}) "section heading" NONE;
val _ = heading_command ("subtitle*", @{here}) "section heading" NONE;
val _ = heading_command ("chapter*", @{here}) "section heading" (SOME (SOME 0));
val _ = heading_command ("section*", @{here}) "section heading" (SOME (SOME 1));
val _ = heading_command ("subsection*", @{here}) "subsection heading" (SOME (SOME 2));
val _ = heading_command ("subsubsection*", @{here}) "subsubsection heading" (SOME (SOME 3));
val _ = heading_command ("paragraph*", @{here}) "paragraph heading" (SOME (SOME 4));
val _ = heading_command ("subparagraph*", @{here}) "subparagraph heading" (SOME (SOME 5));
val _ = heading_command \<^command_keyword>\<open>title*\<close> "section heading" NONE;
val _ = heading_command \<^command_keyword>\<open>subtitle*\<close> "section heading" NONE;
val _ = heading_command \<^command_keyword>\<open>chapter*\<close> "section heading" (SOME (SOME 0));
val _ = heading_command \<^command_keyword>\<open>section*\<close> "section heading" (SOME (SOME 1));
val _ = heading_command \<^command_keyword>\<open>subsection*\<close> "subsection heading" (SOME (SOME 2));
val _ = heading_command \<^command_keyword>\<open>subsubsection*\<close> "subsubsection heading" (SOME (SOME 3));
val _ = heading_command \<^command_keyword>\<open>paragraph*\<close> "paragraph heading" (SOME (SOME 4));
val _ = heading_command \<^command_keyword>\<open>subparagraph*\<close> "subparagraph heading" (SOME (SOME 5));

end
end
@@ -157,6 +157,9 @@ doc_class figure =
placement :: placement
spawn_columns :: bool <= True

doc_class figure2 = figure +
caption :: string

doc_class side_by_side_figure = figure +
anchor :: "string"

@@ -235,8 +238,8 @@ ML\<open>
(* Ontological Macro Command Support *)
(* *********************************************************************** *)

val _ = Onto_Macros.heading_command ("figure*", @{here}) "figure" NONE;
val _ = Onto_Macros.heading_command ("side_by_side_figure*", @{here}) "multiple figures" NONE;
val _ = Onto_Macros.heading_command \<^command_keyword>\<open>figure*\<close> "figure" NONE;
val _ = Onto_Macros.heading_command \<^command_keyword>\<open>side_by_side_figure*\<close> "multiple figures" NONE;
\<close>

(*<*)
|
|||
*)
|
||||
(*>*)
|
||||
|
||||
subsubsection\<open>Figure Content\<close>
|
||||
text\<open>The intermediate development goal is to separate the ontological, top-level construct
|
||||
\<open>figure*\<close>, which will remain a referentiable, ontological document unit, from the more versatile
|
||||
\<^emph>\<open>import\<close> of a figure. The hope is that this opens the way for more orthogonality and
|
||||
abstraction from the LaTeX engine.
|
||||
\<close>
|
||||
|
||||
ML\<open>
|
||||
|
||||
type fig_content = {relative_width : int, (* percent of textwidth, default 100 *)
|
||||
scale : int, (* percent, default 100 *)
|
||||
caption : Input.source (* default empty *)}
|
||||
|
||||
val mt_fig_content = {relative_width = 100,
|
||||
scale = 100,
|
||||
caption = Input.empty }: fig_content
|
||||
|
||||
(* doof wie 100 m feldweg. *)
|
||||
fun upd_relative_width key {relative_width,scale,caption } : fig_content =
    {relative_width = key,scale = scale,caption = caption}: fig_content

fun upd_scale key {relative_width,scale,caption } : fig_content =
    {relative_width = relative_width,scale = key,caption = caption}: fig_content

fun upd_caption key {relative_width,scale,caption} : fig_content =
    {relative_width = relative_width,scale = scale,caption= key}: fig_content


val widthN = "width"
val scaleN = "scale"
val captionN = "caption";

fun fig_content_modes (ctxt, toks) =
    let val (y, toks') = ((((Scan.optional
                              (Args.parens
                                (Parse.list1
                                  (   (Args.$$$ widthN |-- Args.$$$ "=" -- Parse.int
                                       >> (fn (_, k) => upd_relative_width k))
                                   || (Args.$$$ scaleN |-- Args.$$$ "=" -- Parse.int
                                       >> (fn (_, k) => upd_scale k))
                                   || (Args.$$$ captionN |-- Args.$$$ "=" -- Parse.document_source
                                       >> (fn (_, k) => upd_caption k))
                                  ))) [K mt_fig_content])
                              : (fig_content -> fig_content) list parser)
                             >> (foldl1 (op #>)))
                             : (fig_content -> fig_content) parser)
                            (toks)
    in (y, (ctxt, toks')) end
fun document_antiq (check: Proof.context -> Path.T option -> Input.source -> Path.T) =
  Args.context -- Scan.lift Parse.path_input >> (fn (ctxt, source) =>
   (check ctxt NONE source;
    Latex.string (Latex.output_ascii_breakable "/" (Input.string_of source))
    |> Latex.macro "isatt"));


fun fig_content_antiquotation name scan =
    (Document_Output.antiquotation_raw_embedded name
       (scan : ((fig_content -> fig_content) * Input.source) context_parser)
       (fn ctxt =>
          (fn (cfg_trans,file:Input.source) =>
              let val {relative_width,scale,caption} = cfg_trans mt_fig_content
                  val _ = if relative_width < 0 orelse scale<0 then error("negative parameter.")
                          else ()
                  val wdth_s = if relative_width = 100 then ""
                               else "width="^Real.toString((Real.fromInt relative_width)
                                                           / (Real.fromInt 100))^"\\textwidth"
                  val scale_s= if scale = 100 then ""
                               else "scale="^Real.toString((Real.fromInt scale) / (Real.fromInt 100))
                  val arg = enclose "[" "]" (commas [wdth_s,scale_s])
                  val lab = Document_Output.output_document ctxt {markdown = false} caption
                  val path = Resources.check_file ctxt NONE file
                  val _ = writeln("file "^Path.file_name path)
                  (* ToDo: must be declared source of type png or jpeg or pdf, ... *)
              in file
                 |> (Latex.string o Input.string_of)
                 |> (XML.enclose ("\\includegraphics"^arg^"{") "}")
                 |> (fn X => X @ Latex.macro "capture" lab)
              end
          )
       ));
val _ = fig_content_antiquotation
    : binding
      -> ((fig_content -> fig_content) * Input.source) context_parser
      -> theory -> theory

val _ = Theory.setup
          ( fig_content_antiquotation \<^binding>\<open>fig_content\<close>
              (fig_content_modes -- Scan.lift(Parse.path_input)))

\<close>

subsection\<open>Tables\<close>
(* TODO ! ! ! *)
(* ditto the future monitor: table - block *)
@@ -260,7 +356,7 @@ text\<open>Tables are (sub) document-elements represented inside the documentati
language. The used technology is similar to the existing railroad-diagram support
(cf. \<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>, Sec. 4.5).

However, tables are not directly based on the ideosynchrasies of Knuth-based language design ---
However, tables are not directly based on the idiosyncrasies of Knuth-based language design ---

However, tables come with a more abstract structure model than conventional typesetting in the
LaTeX tradition. It is based on the following principles:
@@ -356,7 +452,7 @@ fun upd_cell_line_width num
cell_line_color = cell_line_color, cell_line_width = cell_line_width@[num] }
: cell_config


(*global default configs *)
val (tab_cell_placing, tab_cell_placing_setup)
     = Attrib.config_string \<^binding>\<open>tab_cell_placing\<close> (K "center");
val (tab_cell_height, tab_cell_height_setup)
@@ -387,6 +483,8 @@ val _ = Theory.setup( tab_cell_placing_setup
#> tab_cell_line_width_setup
)


(*syntax for local tab specifier *)
val cell_placingN = "cell_placing"
val cell_heightN = "cell_height"
val cell_widthN = "cell_width"
@@ -396,7 +494,8 @@ val cell_line_widthN = "cell_line_width"

val placing_scan = Args.$$$ "left" || Args.$$$ "center" || Args.$$$ "right"

val color_scan = Args.$$$ "none" || Args.$$$ "red" || Args.$$$ "green" || Args.$$$ "blue" || Args.$$$ "black"
val color_scan = Args.$$$ "none" || Args.$$$ "red" || Args.$$$ "green"
             || Args.$$$ "blue" || Args.$$$ "black"

(*
@@ -516,7 +615,7 @@ val _ = Theory.setup
end
\<close>


text\<open> @{file "../ROOT"} \<close>
define_shortcut* clearpage \<rightleftharpoons> \<open>\clearpage{}\<close>
                 hf \<rightleftharpoons> \<open>\hfill\<close>
                 br \<rightleftharpoons> \<open>\break\<close>

@@ -526,6 +625,9 @@ declare[[tab_cell_placing="left",tab_cell_height="18.0cm"]]

section\<open>Tests\<close>
(*<*)

text\<open> @{fig_content [display] (scale = 80, width=80, caption=\<open>this is \<^term>\<open>\<sigma>\<^sub>i+2\<close> \<dots>\<close>)
      \<open>../../examples/scholarly_paper/2018-cicm-isabelle_dof-applications/document/figures/isabelle-architecture.pdf\<close>}\<close>
text\<open> @{table_inline [display] (cell_placing = center,cell_height =\<open>12.0cm\<close>,
                                cell_height =\<open>13pt\<close>, cell_width = \<open>12.0cm\<close>,
                                cell_bgnd_color=black,cell_line_color=red,cell_line_width=\<open>12.0cm\<close>)
src/DOF/Isa_DOF.thy (1659 changes; diff suppressed because it is too large)
@@ -178,7 +178,7 @@ local open RegExpChecker in

type automaton = state * ((Int.int -> state -> state) * (state -> bool))

val add_atom = fold_aterms (fn Const(c as(_,Type(@{type_name "rexp"},_)))=> insert (op=) c |_=>I);
val add_atom = fold_aterms (fn Const (c as (_, \<^Type>\<open>rexp _\<close>)) => insert (op=) c | _=> I);
fun alphabet termS = rev(map fst (fold add_atom termS []));
fun ext_alphabet env termS =
    let val res = rev(map fst (fold add_atom termS [])) @ env;
@@ -187,14 +187,14 @@ local open RegExpChecker in
        else ()
    in res end;

fun conv (Const(@{const_name "Regular_Exp.rexp.Zero"},_)) _ = Zero
   |conv (Const(@{const_name "Regular_Exp.rexp.One"},_)) _ = Onea
   |conv (Const(@{const_name "Regular_Exp.rexp.Times"},_) $ X $ Y) env = Times(conv X env, conv Y env)
   |conv (Const(@{const_name "Regular_Exp.rexp.Plus"},_) $ X $ Y) env = Plus(conv X env, conv Y env)
   |conv (Const(@{const_name "Regular_Exp.rexp.Star"},_) $ X) env = Star(conv X env)
   |conv (Const(@{const_name "RegExpInterface.opt"},_) $ X) env = Plus(conv X env, Onea)
   |conv (Const(@{const_name "RegExpInterface.rep1"},_) $ X) env = Times(conv X env, Star(conv X env))
   |conv (Const (s, Type(@{type_name "rexp"},_))) env =
fun conv \<^Const_>\<open>Regular_Exp.rexp.Zero _\<close> _ = Zero
   |conv \<^Const_>\<open>Regular_Exp.rexp.One _\<close> _ = Onea
   |conv \<^Const_>\<open>Regular_Exp.rexp.Times _ for X Y\<close> env = Times(conv X env, conv Y env)
   |conv \<^Const_>\<open>Regular_Exp.rexp.Plus _ for X Y\<close> env = Plus(conv X env, conv Y env)
   |conv \<^Const_>\<open>Regular_Exp.rexp.Star _ for X\<close> env = Star(conv X env)
   |conv \<^Const_>\<open>RegExpInterface.opt _ for X\<close> env = Plus(conv X env, Onea)
   |conv \<^Const_>\<open>RegExpInterface.rep1 _ for X\<close> env = Times(conv X env, Star(conv X env))
   |conv (Const (s, \<^Type>\<open>rexp _\<close>)) env =
    let val n = find_index (fn x => x = s) env
        val _ = if n<0 then error"conversion error of regexp." else ()
    in Atom(n) end
src/ROOT (2 changes)
@@ -14,4 +14,6 @@ session "Isabelle_DOF" = "Functional-Automata" +
theories
"DOF/Isa_DOF"
"ontologies/ontologies"
export_classpath
@@ -19,8 +19,6 @@
%% you need to download lipics.cls from
%% https://www.dagstuhl.de/en/publications/lipics/instructions-for-authors/
%% and add it manually to the preamble.tex and the ROOT file.
%% Moreover, the option "document_comment_latex=true" needs to be set
%% in the ROOT file.
%%
%% All customization and/or additional packages should be added to the file
%% preamble.tex.
@@ -173,8 +173,9 @@ Definition* [eal_def, tag= "''evaluation assurance level''"]
CC predefined assurance scale, that form an assurance package\<close>

Definition* [eval_auth_def, tag="''evaluation authority''"]
\<open>body that sets the standards and monitors the quality of evaluations conducted by bodies within a specific community and
implements the CC for that community by means of an evaluation scheme\<close>
\<open>body that sets the standards and monitors the quality of evaluations conducted
by bodies within a specific community and implements the CC for that community
by means of an evaluation scheme\<close>

Definition* [eval_schm_def, tag="''evaluation scheme''"]
\<open>administrative and regulatory framework under which the CC is applied by an
@@ -28,6 +28,8 @@ theory CENELEC_50128
  imports "Isabelle_DOF.technical_report"
begin

define_ontology "DOF-CENELEC_50128.sty"

(* this is a hack and should go into an own ontology, providing thingsd like:
   - Assumption*
   - Hypothesis*
@@ -155,10 +157,10 @@ which have the required safety integrity level.\<close>
Definition*[entity]
\<open>person, group or organisation who fulfils a role as defined in this European Standard.\<close>

declare_reference*[fault]
declare_reference*[fault::cenelec_term]
Definition*[error]
\<open>defect, mistake or inaccuracy which could result in failure or in a deviation
from the intended performance or behaviour (cf. @{cenelec_term (unchecked) \<open>fault\<close>})).\<close>
from the intended performance or behaviour (cf. @{cenelec_term (unchecked) \<open>fault\<close>}).\<close>

Definition*[fault]
\<open>defect, mistake or inaccuracy which could result in failure or in a deviation
@@ -1007,14 +1009,16 @@ ML\<open>
fun check_sil oid _ ctxt =
  let
    val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
    val monitor_record_value = #value (the (DOF_core.get_object_local oid ctxt'))
    val DOF_core.Instance {value = monitor_record_value, ...} =
      DOF_core.get_object_global (oid, Position.none) (Context.theory_of ctxt)
    val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
    val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
    val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
    fun check_sil'' [] = true
      | check_sil'' (x::xs) =
          let
            val (_, doc_oid) = x
            val doc_record_value = #value (the (DOF_core.get_object_local doc_oid ctxt'))
            val DOF_core.Instance {value = doc_record_value, ...} =
              DOF_core.get_object_global (doc_oid, Position.none) (Context.theory_of ctxt)
            val Const _ $ _ $ _ $ _ $ _ $ cenelec_document_ext = doc_record_value
            val Const _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = cenelec_document_ext
          in
@@ -1036,17 +1040,20 @@ ML\<open>
fun check_sil_slow oid _ ctxt =
  let
    val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
    val monitor_record_value = #value (the (DOF_core.get_object_local oid ctxt'))
    val monitor_cid = #cid (the (DOF_core.get_object_local oid ctxt'))
    val DOF_core.Instance {value = monitor_record_value, ...} =
      DOF_core.get_object_global (oid, Position.none) (Context.theory_of ctxt)
    val DOF_core.Instance {cid = monitor_cid, ...} =
      DOF_core.get_object_global (oid, Position.none) (Context.theory_of ctxt)
    val monitor_sil_typ = (Syntax.read_typ ctxt' monitor_cid) --> @{typ "sil"}
    val monitor_sil = Value_Command.value ctxt'
      (Const("CENELEC_50128.monitor_SIL.sil", monitor_sil_typ) $ monitor_record_value)
    val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
    val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
    fun check_sil' [] = true
      | check_sil' (x::xs) =
          let
            val (doc_cid, doc_oid) = x
            val doc_record_value = #value (the (DOF_core.get_object_local doc_oid ctxt'))
            val DOF_core.Instance {value = doc_record_value, ...} =
              DOF_core.get_object_global (doc_oid, Position.none) (Context.theory_of ctxt)
            val doc_sil_typ = (Syntax.read_typ ctxt' doc_cid) --> @{typ "sil"}
            val doc_sil = Value_Command.value ctxt'
              (Const ("CENELEC_50128.cenelec_document.sil", doc_sil_typ) $ doc_record_value)
@@ -1062,17 +1069,17 @@ fun check_sil_slow oid _ ctxt =
(*setup\<open>DOF_core.update_class_invariant "CENELEC_50128.monitor_SIL0" check_sil_slow\<close>*)

(* As traces of monitor instances (docitems) are updated each time an instance is declared
   (with text*, section*, etc.), invariants checking functions which use traces must
   be declared as lazy invariants, to be checked only when closing a monitor, i.e.,
   after the monitor traces are populated.
   (with text*, section*, etc.), invariants checking functions which check the full list of traces
   must be declared as lazy invariants, to be checked only when closing a monitor, i.e.,
   after all the monitor traces are populated.
*)
ML\<open>
fun check_required_documents oid _ ctxt =
  let
    val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
    val {monitor_tab,...} = DOF_core.get_data ctxt'
    val {accepted_cids, ...} = the (Symtab.lookup monitor_tab oid)
    val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
    val DOF_core.Monitor_Info {accepted_cids, ...} =
      DOF_core.get_monitor_info_global (oid, Position.none) (Context.theory_of ctxt)
    val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
    fun check_required_documents' [] = true
      | check_required_documents' (cid::cids) =
          if exists (fn (doc_cid, _) => equal cid doc_cid) traces
@@ -1080,7 +1087,8 @@ fun check_required_documents oid _ ctxt =
          else
            let
              val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
              val monitor_record_value = #value (the (DOF_core.get_object_local oid ctxt'))
              val DOF_core.Instance {value = monitor_record_value, ...} =
                DOF_core.get_object_global (oid, Position.none) (Context.theory_of ctxt)
              val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
            in error ("A " ^ cid ^ " cenelec document is required with "
                      ^ Syntax.string_of_term ctxt' monitor_sil)
@@ -1099,11 +1107,11 @@ text*[MonitorPatternMatchingTest::monitor_SIL0]\<open>\<close>
text*[CenelecClassPatternMatchingTest::SQAP, sil = "SIL0"]\<open>\<close>
ML\<open>
val thy = @{theory}
val monitor_record_value =
  #value (the (DOF_core.get_object_global "MonitorPatternMatchingTest" thy))
val DOF_core.Instance {value = monitor_record_value, ...} =
  DOF_core.get_object_global ("MonitorPatternMatchingTest", Position.none) thy
val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
val doc_record_value = #value (the (DOF_core.get_object_global
  "CenelecClassPatternMatchingTest" thy))
val DOF_core.Instance {value = doc_record_value, ...} =
  DOF_core.get_object_global ("CenelecClassPatternMatchingTest", Position.none) thy
val Const _ $ _ $ _ $ _ $ _ $ cenelec_document_ext = doc_record_value
val Const _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = cenelec_document_ext
\<close>
@@ -1264,8 +1272,9 @@ ML
DOF_core.is_subclass @{context} "CENELEC_50128.EC" "CENELEC_50128.test_requirement"; \<close>

ML
\<open> val {docobj_tab={maxano, tab=ref_tab},docclass_tab=class_tab,...} = DOF_core.get_data @{context};
Symtab.dest ref_tab;
\<open> val ref_tab = DOF_core.get_instances \<^context>
val {docclass_tab=class_tab,...} = DOF_core.get_data @{context};
Name_Space.dest_table ref_tab;
Symtab.dest class_tab; \<close>

ML
@@ -28,7 +28,7 @@ doc_class A =
subsection\<open>Excursion: On the semantic consequences of this definition: \<close>

text\<open>This class definition leads an implicit Isabelle/HOL \<^theory_text>\<open>record\<close> definition
(cf. \<^url>\<open>https://isabelle.in.tum.de/dist/Isabelle2021/doc/isar-ref.pdf\<close>, chapter 11.6.).
(cf. \<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>, chapter 11.6.).
Consequently, \<^theory_text>\<open>doc_class\<close>'es inherit the entire theory-infrastructure from Isabelle records:
\<^enum> there is a HOL-type \<^typ>\<open>A\<close> and its extensible version \<^typ>\<open>'a A_scheme\<close>
\<^enum> there are HOL-terms representing \<^emph>\<open>doc_class instances\<close> with the high-level syntax:
@@ -177,4 +177,49 @@ text*[ sdfg :: F] \<open> Lorem ipsum @{thm refl}\<close>
text*[ xxxy ] \<open> Lorem ipsum @{F \<open>sdfg\<close>} rate @{thm refl}\<close>
close_monitor*[aaa]

end
doc_class test_monitor_free =
  tmhd :: int
doc_class test_monitor_head =
  tmhd :: int
doc_class test_monitor_A = test_monitor_head +
  tmA :: int
doc_class test_monitor_B = test_monitor_A +
  tmB :: int
doc_class test_monitor_C = test_monitor_A +
  tmC :: int
doc_class test_monitor_D = test_monitor_B +
  tmD :: int
doc_class test_monitor_E = test_monitor_D +
  tmE :: int

doc_class monitor_M =
  tmM :: int
  rejects "test_monitor_A"
  accepts "test_monitor_head ~~ test_monitor_B ~~ test_monitor_C"

declare[[free_class_in_monitor_checking]]

open_monitor*[test_monitor_M::monitor_M]

text*[testFree::test_monitor_free]\<open>\<close>

open_monitor*[test_monitor_M2::monitor_M]

text*[test_monitor_A1::test_monitor_A]\<open>\<close>
text*[testFree2::test_monitor_free]\<open>\<close>
text*[test_monitor_head1::test_monitor_head]\<open>\<close>
text*[testFree3::test_monitor_free]\<open>\<close>
text*[test_monitor_B1::test_monitor_B]\<open>\<close>
text*[testFree4::test_monitor_free]\<open>\<close>
text*[test_monitor_D1::test_monitor_D]\<open>\<close>
text*[testFree5::test_monitor_free]\<open>\<close>
text*[test_monitor_C1::test_monitor_C]\<open>\<close>
text*[testFree6::test_monitor_free]\<open>\<close>

close_monitor*[test_monitor_M2]

close_monitor*[test_monitor_M]

declare[[free_class_in_monitor_checking = false]]

end
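The test theory above exercises monitor classes: `accepts` takes a regular expression over class identifiers (`~~` is sequencing), `rejects` lists forbidden classes, and instances declared between `open_monitor*` and `close_monitor*` form the monitor's trace. As a rough sketch only (plain Python, literal name matching — real DOF also accepts subclasses of the named classes, which this omits), the acceptance check can be pictured as matching the trace against a compiled pattern:

```python
# Sketch: read an `accepts` clause as a regex over class ids, with `~~` as
# sequencing. Rejected classes abort; classes not named in the spec (such as
# the free classes above) are ignored for the match.
import re

def accepts_to_regex(spec):
    parts = [name.strip() for name in spec.split("~~")]
    return "".join(f"(?:{re.escape(p)};)" for p in parts)

def check_monitor(spec, rejects, trace):
    if any(cid in rejects for cid in trace):
        raise ValueError("instance of a rejected class in monitor scope")
    relevant = [cid for cid in trace if cid in spec]   # crude literal filter
    word = "".join(cid + ";" for cid in relevant)
    return re.fullmatch(accepts_to_regex(spec), word) is not None
```

With the `monitor_M` spec above, a trace `head, B, C` (interleaved with free classes) matches, while `B, head, C` does not.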
@@ -18,6 +18,8 @@
\RequirePackage{amsthm}

\newtheorem{example}{Example}
%\newtheorem{assertion}{Assumption} %% Hack
\newtheorem{assumption}{Assumption}
\newtheorem{definition}{Definition}
\newtheorem{theorem}{Theorem}
@@ -171,12 +171,12 @@

% \newcommand{\formalMathStmt[label, mcc, ]
% end: scholarly_paper.abstract
%    | "rule" | "assn" | "expl" | rem | "notation" | "terminology"
%    | "rule" | "assn" | "assm" | "expl" | rem | "notation" | "terminology"
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\newisadof{text.scholarly_paper.math_content}%
[label=,type=%
, scholarly_paper.math_content.short_name ={} % {} or \relax
, scholarly_paper.math_content.mcc = % "def" | "axm" | "thm" | "lem" | "prop" | "rule" | "assn"
, scholarly_paper.math_content.mcc = % "def" | "axm" | "thm" | "lem" | "prop" | "rule" | "assn" | "assm"
, Isa_COL.text_element.level =%
, Isa_COL.text_element.referentiable =%
, Isa_COL.text_element.variants =%
@@ -279,6 +279,15 @@
        \end{assertion}}
    }%
}%
{\ifthenelse{\equal{\commandkey{scholarly_paper.math_content.mcc}}{assm}}
{%
    {\ifthenelse{\equal{\commandkey{scholarly_paper.math_content.short_name}} {} }
        {\begin{assumption} \label{\commandkey{label}} #1 \end{assumption} }
        {\begin{assumption} [\commandkey{scholarly_paper.math_content.short_name}]
            \label{\commandkey{label}} #1
        \end{assumption}}
    }%
}%
{\ifthenelse{\equal{\commandkey{scholarly_paper.math_content.mcc}}{notation}}
{%
    {\ifthenelse{\equal{\commandkey{scholarly_paper.math_content.short_name}} {} }
@@ -310,6 +319,7 @@
}%
}%
}%
}%
}%
\end{isamarkuptext}%
}
@@ -17,9 +17,11 @@ theory scholarly_paper
  imports "Isabelle_DOF.Isa_COL"
  keywords "author*" "abstract*"
    "Definition*" "Lemma*" "Theorem*" :: document_body

begin

define_ontology "DOF-scholarly_paper.sty"
define_ontology "DOF-scholarly_paper-thm.sty"

text\<open>Scholarly Paper provides a number of standard text - elements for scientific papers.
They were introduced in the following.\<close>
@@ -46,12 +48,12 @@ doc_class abstract =

ML\<open>
val _ =
  Monitor_Command_Parser.document_command ("abstract*", @{here}) "Textual Definition"
  Monitor_Command_Parser.document_command \<^command_keyword>\<open>abstract*\<close> "Textual Definition"
    {markdown = true, body = true}
    (Onto_Macros.enriched_document_cmd_exp (SOME "abstract") []);

val _ =
  Monitor_Command_Parser.document_command ("author*", @{here}) "Textual Definition"
  Monitor_Command_Parser.document_command \<^command_keyword>\<open>author*\<close> "Textual Definition"
    {markdown = true, body = true}
    (Onto_Macros.enriched_document_cmd_exp (SOME "author") []);
\<close>
@@ -67,7 +69,7 @@ which we formalize into:\<close>
doc_class text_section = text_element +
  main_author :: "author option"  <=  None
  fixme_list  :: "string list"    <=  "[]"
  level       :: "int option"     <=  "None"
  level       :: "int option"     <=  "None"
  (* this attribute enables doc-notation support section* etc.
     we follow LaTeX terminology on levels
     part          = Some -1
@@ -81,15 +83,19 @@ doc_class text_section = text_element +

doc_class "conclusion" = text_section +
  main_author :: "author option"  <=  None
  invariant L\<^sub>c\<^sub>o\<^sub>n\<^sub>c :: "(level \<sigma>) \<noteq> None \<and> the (level \<sigma>) > 0"

doc_class related_work = "conclusion" +
  main_author :: "author option"  <=  None
  invariant L\<^sub>r\<^sub>w :: "(level \<sigma>) \<noteq> None \<and> the (level \<sigma>) > 0"

doc_class bibliography = text_section +
  style :: "string option"  <=  "Some ''LNCS''"
  invariant L\<^sub>b\<^sub>i\<^sub>b :: "(level \<sigma>) \<noteq> None \<and> the (level \<sigma>) > 0"

doc_class annex = "text_section" +
  main_author :: "author option"  <=  None
  invariant L\<^sub>a\<^sub>n\<^sub>n\<^sub>e\<^sub>x :: "(level \<sigma>) \<noteq> None \<and> the (level \<sigma>) > 0"

(*
datatype sc_dom = math | info | natsc | eng
@@ -101,6 +107,7 @@ subsection\<open>Introductions\<close>
doc_class introduction = text_section +
  comment :: string
  claims  :: "thm list"
  invariant L\<^sub>i\<^sub>n\<^sub>t\<^sub>r\<^sub>o :: "(level \<sigma>) \<noteq> None \<and> the (level \<sigma>) > 0"

text\<open>Technical text-elements posses a status: they can be either an \<^emph>\<open>informal explanation\<close> /
description or a kind of introductory text to definition etc. or a \<^emph>\<open>formal statement\<close> similar
@@ -115,6 +122,7 @@ A formal statement can, but must not have a reference to true formal Isabelle/Is
doc_class background = text_section +
  comment :: string
  claims  :: "thm list"
  invariant L\<^sub>b\<^sub>a\<^sub>c\<^sub>k :: "(level \<sigma>) \<noteq> None \<and> the (level \<sigma>) > 0"

subsection\<open>Technical Content and its Formats\<close>
@@ -132,7 +140,7 @@ doc_class technical = text_section +
  definition_list :: "string list"  <=  "[]"
  status          :: status         <=  "description"
  formal_results  :: "thm list"
  invariant L1 :: "the (level \<sigma>) > 0"
  invariant L\<^sub>t\<^sub>e\<^sub>c\<^sub>h :: "(level \<sigma>) \<noteq> None \<and> the (level \<sigma>) > 0"

type_synonym tc = technical (* technical content *)
@@ -148,6 +156,7 @@ doc_class example = text_section +
  referentiable :: bool    <=  True
  status        :: status  <=  "description"
  short_name    :: string  <=  "''''"
  invariant L\<^sub>e\<^sub>x\<^sub>a :: "(level \<sigma>) \<noteq> None \<and> the (level \<sigma>) > 0"

subsection\<open>Freeform Mathematical Content\<close>
@@ -158,7 +167,7 @@ provided \<^emph>\<open>theorem environments\<close> (see \<^verbatim>\<open>tex
that it is well-established and compatible with many LaTeX - styles.\<close>

datatype math_content_class = "defn" | "axm" | "thm" | "lem" | "cor" | "prop"
                            | "expl" | "rule" | "assn"
                            | "expl" | "rule" | "assn" | "assm"
                            | rem | "notation" | "terminology"

(*
@@ -289,7 +298,7 @@ setup\<open>Theorem_default_class_setup\<close>
ML\<open> local open ODL_Meta_Args_Parser in

val _ =
  Monitor_Command_Parser.document_command ("Definition*", @{here}) "Textual Definition"
  Monitor_Command_Parser.document_command \<^command_keyword>\<open>Definition*\<close> "Textual Definition"
    {markdown = true, body = true}
    (fn meta_args => fn thy =>
      let
@@ -301,7 +310,7 @@ val _ =
      end);

val _ =
  Monitor_Command_Parser.document_command ("Lemma*", @{here}) "Textual Lemma Outline"
  Monitor_Command_Parser.document_command \<^command_keyword>\<open>Lemma*\<close> "Textual Lemma Outline"
    {markdown = true, body = true}
    (fn meta_args => fn thy =>
      let
@@ -313,7 +322,7 @@ val _ =
      end);

val _ =
  Monitor_Command_Parser.document_command ("Theorem*", @{here}) "Textual Theorem Outline"
  Monitor_Command_Parser.document_command \<^command_keyword>\<open>Theorem*\<close> "Textual Theorem Outline"
    {markdown = true, body = true}
    (fn meta_args => fn thy =>
      let
@@ -453,10 +462,10 @@ fun dest_option _ (Const (@{const_name "None"}, _)) = NONE

in

fun check ctxt cidS mon_id pos =
    let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos @{here}
fun check ctxt cidS mon_id pos_opt =
    let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos_opt @{here}
        val groups = partition (Context.proof_of ctxt) cidS trace
        fun get_level_raw oid = AttributeAccess.compute_attr_access ctxt "level" oid @{here} @{here};
        fun get_level_raw oid = ISA_core.compute_attr_access ctxt "level" oid NONE @{here};
        fun get_level oid = dest_option (snd o HOLogic.dest_number) (get_level_raw (oid));
        fun check_level_hd a = case (get_level (snd a)) of
                                 NONE => error("Invariant violation: leading section" ^ snd a ^
@@ -484,7 +493,7 @@ end
setup\<open> let val cidS = ["scholarly_paper.introduction","scholarly_paper.technical",
                       "scholarly_paper.example", "scholarly_paper.conclusion"];
           fun body moni_oid _ ctxt = (Scholarly_paper_trace_invariant.check
                                           ctxt cidS moni_oid @{here};
                                           ctxt cidS moni_oid NONE;
                                       true)
       in DOF_core.update_class_invariant "scholarly_paper.article" body end\<close>
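The `setup` above registers a lazy class invariant that, at monitor-close time, recomputes the trace and checks each section group. As an illustrative sketch only (plain Python with hypothetical names; the real `partition`/`check` live in the ML structure above), the grouping-and-level check works roughly like this:

```python
# Sketch: a monitor trace is a list of (class_id, object_id) pairs. Entries
# whose class is in `cidS` open a new group; each group's head must report a
# defined `level` attribute, otherwise the invariant is violated.
def partition(cidS, trace):
    groups, current = [], []
    for cid, oid in trace:
        if cid in cidS:
            if current:
                groups.append(current)
            current = [(cid, oid)]
        elif current:
            current.append((cid, oid))
    if current:
        groups.append(current)
    return groups

def check(cidS, trace, level_of):
    # level_of: object id -> int level or None (cf. get_level in the ML code)
    for group in partition(cidS, trace):
        _, head_oid = group[0]
        if level_of(head_oid) is None:
            raise ValueError(f"Invariant violation: leading section {head_oid} "
                             "must have a defined level")
    return True
```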
@@ -65,8 +65,8 @@ doc_class result = technical +

ML\<open>fun check_invariant_invariant oid {is_monitor:bool} ctxt =
  let val kind_term = AttributeAccess.compute_attr_access ctxt "kind" oid @{here} @{here}
      val property_termS = AttributeAccess.compute_attr_access ctxt "property" oid @{here} @{here}
  let val kind_term = ISA_core.compute_attr_access ctxt "kind" oid NONE @{here}
      val property_termS = ISA_core.compute_attr_access ctxt "property" oid NONE @{here}
      val tS = HOLogic.dest_list property_termS
  in  case kind_term of
        @{term "proof"} => if not(null tS) then true
@@ -137,10 +137,10 @@ fun dest_option _ (Const (@{const_name "None"}, _)) = NONE

in

fun check ctxt cidS mon_id pos =
    let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos @{here}
fun check ctxt cidS mon_id pos_opt =
    let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos_opt @{here}
        val groups = partition (Context.proof_of ctxt) cidS trace
        fun get_level_raw oid = AttributeAccess.compute_attr_access ctxt "level" oid @{here} @{here};
        fun get_level_raw oid = ISA_core.compute_attr_access ctxt "level" oid NONE @{here};
        fun get_level oid = dest_option (snd o HOLogic.dest_number) (get_level_raw (oid));
        fun check_level_hd a = case (get_level (snd a)) of
                                 NONE => error("Invariant violation: leading section" ^ snd a ^
@@ -165,7 +165,7 @@ end
\<close>

setup\<open> let val cidS = ["small_math.introduction","small_math.technical", "small_math.conclusion"];
           fun body moni_oid _ ctxt = (Small_Math_trace_invariant.check ctxt cidS moni_oid @{here};
           fun body moni_oid _ ctxt = (Small_Math_trace_invariant.check ctxt cidS moni_oid NONE;
                                       true)
       in DOF_core.update_class_invariant "small_math.article" body end\<close>
@@ -17,6 +17,8 @@ theory technical_report
  imports "Isabelle_DOF.scholarly_paper"
begin

define_ontology "DOF-technical_report.sty"

(* for reports paper: invariant: level \<ge> -1 *)

section\<open>More Global Text Elements for Reports\<close>
@@ -0,0 +1,7 @@
session "Isabelle_DOF-Proofs" = "HOL-Proofs" +
  options [document = false, record_proofs = 2, parallel_limit = 500, document_build = dof]
  sessions
    "Isabelle_DOF"
  theories
    Isabelle_DOF.ontologies
    Isabelle_DOF.Isa_DOF
@@ -0,0 +1,120 @@
/*
 * Copyright (c)
 *   2021-2022 The University of Exeter.
 *   2021-2022 The University of Paris-Saclay.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 * 1. Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 * 2. Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in
 *    the documentation and/or other materials provided with the
 *    distribution.
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
 * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
 * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
 * FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
 * COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
 * INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
 * BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
 * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
 * CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
 * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
 * ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
 * POSSIBILITY OF SUCH DAMAGE.
 *
 * SPDX-License-Identifier: BSD-2-Clause
 */


/*** constants and parameters for Isabelle/DOF ***/

package isabelle.dof

import isabelle._


object DOF {
  /** parameters **/

  val isabelle_version = "2022"
  val isabelle_url = "https://isabelle.in.tum.de/website-Isabelle2022"

  val afp_version = "afp-2022-10-27"

  // Isabelle/DOF version: "Unreleased" for development, semantic version for releases
  val version = "Unreleased"

  val session = "Isabelle_DOF"

  val latest_version = "1.3.0"
  val latest_isabelle = "Isabelle2021-1"
  val latest_doi = "10.5281/zenodo.6810799"
  val generic_doi = "10.5281/zenodo.3370482"

  // Isabelle/DOF source repository
  val url = "https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF"

  // Isabelle/DOF release artifacts
  val artifact_dir = "releases/Isabelle_DOF/Isabelle_DOF"
  val artifact_host = "artifacts.logicalhacking.com"
  val artifact_url: String = "https://" + artifact_host + "/" + artifact_dir

  def options(opts: Options): Options = opts + "document_comment_latex"


  /** Isabelle tool wrapper **/

  sealed case class Parameter(name: String, value: String) {
    override def toString: String = name

    def print(value_only: Boolean): String =
      if (value_only) value else name + "=" + value
  }

  val parameters: List[Parameter] =
    List(
      Parameter("isabelle_version", isabelle_version),
      Parameter("afp_version", afp_version),
      Parameter("dof_version", version)
    ).sortBy(_.name)

  def print_parameters(names: List[String],
    all: Boolean = false,
    value_only: Boolean = false,
    progress: Progress = new Progress
  ): Unit = {
    val bad = names.filter(name => !parameters.exists(_.name == name))
    if (bad.nonEmpty) error("Unknown parameter(s): " + commas_quote(bad))

    val params = if (all) parameters else parameters.filter(p => names.contains(p.name))
    for (p <- params) progress.echo(p.print(value_only))
  }

  val isabelle_tool = Isabelle_Tool("dof_param", "print Isabelle/DOF parameters",
    Scala_Project.here, args =>
    {
      var all = false
      var value_only = false

      val getopts = Getopts("""
Usage: isabelle dof_param [OPTIONS] NAMES

  Options are:
    -a           print all parameters
    -b           print values only (default: NAME=VALUE)

  Print given Isabelle/DOF parameters, with names from the list:
""" + commas_quote(parameters.map(_.toString)),
        "a" -> (_ => all = true),
        "b" -> (_ => value_only = true))

      val names = getopts(args)
      if (names.isEmpty && !all) getopts.usage()

      print_parameters(names, all = all, value_only = value_only, progress = new Console_Progress)
    })
}
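The new `DOF.scala` above centralizes version pins and exposes them through the `isabelle dof_param` tool. As an illustrative analogue only (Python, hypothetical helper names — the real implementation is the Scala `print_parameters` above), the lookup-and-render logic amounts to:

```python
# Sketch of the dof_param behaviour: filter a fixed parameter table by name
# and render NAME=VALUE, or the bare value (mirroring the -b option).
# Values are the ones pinned in DOF.scala above.
PARAMETERS = {
    "afp_version": "afp-2022-10-27",
    "dof_version": "Unreleased",
    "isabelle_version": "2022",
}

def print_parameters(names, show_all=False, value_only=False):
    bad = [n for n in names if n not in PARAMETERS]
    if bad:
        raise ValueError("Unknown parameter(s): " + ", ".join(bad))
    selected = PARAMETERS.keys() if show_all else names
    return [PARAMETERS[n] if value_only else f"{n}={PARAMETERS[n]}"
            for n in sorted(selected)]
```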
@@ -1,7 +1,7 @@
/*
 * Copyright (c)
 *   2021-2022 The University of Exeter.
 *   2021-2022 The University of Paris-Saclay.
 * Copyright (c)
 *   2021-2022 The University of Exeter.
 *   2021-2022 The University of Paris-Saclay.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
@@ -28,6 +28,8 @@
 * SPDX-License-Identifier: BSD-2-Clause
 */

/*** document build engine for Isabelle/DOF ***/

package isabelle.dof

import isabelle._
@ -37,61 +39,61 @@ object DOF_Document_Build
|
|||
{
|
||||
class Engine extends Document_Build.Bash_Engine("dof")
|
||||
{
|
||||
// override def use_build_script: Boolean = true
|
||||
def the_document_entry(context: Document_Build.Context, name: String): Export.Entry = {
|
||||
val entries =
|
||||
for {
|
||||
node_name <- context.document_theories
|
||||
entry <- context.session_context.get(node_name.theory, name)
|
||||
} yield entry
|
||||
|
||||
entries match {
|
||||
case List(entry) => entry
|
||||
case Nil =>
|
||||
error("Missing export " + quote(name) + " for document theories of session " +
|
||||
quote(context.session))
|
||||
case dups =>
|
||||
error("Multiple exports " + quote(name) + " for theories " +
|
||||
commas_quote(dups.map(_.theory_name).sorted.distinct))
|
||||
}
|
||||
}
|
||||
|
||||
override def prepare_directory(
|
||||
context: Document_Build.Context,
|
||||
dir: Path,
|
||||
doc: Document_Build.Document_Variant): Document_Build.Directory =
|
||||
{
|
||||
val regex = """^.*\.""".r
|
||||
val latex_output = new Latex_Output(context.options)
|
||||
val options = DOF.options(context.options)
|
||||
val latex_output = new Latex_Output(options)
|
||||
val directory = context.prepare_directory(dir, doc, latex_output)
|
||||
|
||||
// produced by alternative presentation hook (workaround for missing Toplevel.present_theory)
|
||||
for {
|
||||
name <- context.document_theories.iterator
|
||||
entry <- context.session_context.get(name.theory, Export.DOCUMENT_LATEX + "_dof")
|
||||
} {
|
||||
val path = Path.basic(Document_Build.tex_name(name))
|
||||
val xml = YXML.parse_body(entry.text)
|
||||
File.content(path, xml).output(latex_output(_, file_pos = path.implode_symbolic))
|
||||
.write(directory.doc_dir)
|
||||
}
|
||||
val isabelle_dof_dir = context.session_context.sessions_structure("Isabelle_DOF").dir
|
||||
// print(context.options.string("dof_url"));
|
||||
|
||||
// copy Isabelle/DOF LaTeX templates
|
||||
val template_dir = isabelle_dof_dir + Path.explode("document-templates")
// TODO: error handling in case 1) template does not exist or 2) root.tex does already exist
val template = regex.replaceAllIn(context.options.string("dof_template"),"")
Isabelle_System.copy_file(template_dir + Path.explode("root-"+template+".tex"),
  directory.doc_dir+Path.explode("root.tex"))
val isabelle_dof_dir = context.session_context.sessions_structure(DOF.session).dir

// copy Isabelle/DOF LaTeX styles
// LaTeX styles from Isabelle/DOF directory
List(Path.explode("DOF/latex"), Path.explode("ontologies"))
  .flatMap(dir =>
    File.find_files((isabelle_dof_dir + dir).file,
      file => file.getName.endsWith(".sty"), include_dirs = true))
  .flatMap(dir => File.find_files((isabelle_dof_dir + dir).file, _.getName.endsWith(".sty")))
  .foreach(sty => Isabelle_System.copy_file(sty, directory.doc_dir.file))

// create ontologies.tex
val ltx_styles = context.options.string("dof_ontologies").split(" +").map(s => regex.replaceAllIn(s,""))
File.write(directory.doc_dir+Path.explode("ontologies.tex"),
  ltx_styles.mkString("\\usepackage{DOF-","}\n\\usepackage{DOF-","}\n"))
// ontologies.tex from session exports
File.write(directory.doc_dir + Path.explode("ontologies.tex"),
  split_lines(the_document_entry(context, "dof/use_ontology").text)
    .map(name => "\\usepackage{DOF-" + name + "}\n").mkString)

// create dof-config.sty
File.write(directory.doc_dir+Path.explode("dof-config.sty"), """
\newcommand{\isabelleurl}{https://isabelle.in.tum.de/website-Isabelle2022/""" + context.options.string("dof_isabelle") + """}
\newcommand{\dofurl}{""" + context.options.string("dof_url") + """}
\newcommand{\dof@isabelleversion}{""" + context.options.string("dof_isabelle") + """}
\newcommand{\isabellefullversion}{""" + context.options.string("dof_isabelle") + """\xspace}
\newcommand{\dof@version}{""" + context.options.string("dof_version") + """}
\newcommand{\dof@artifacturl}{""" + context.options.string("dof_artifact_dir") + """}
\newcommand{\doflatestversion}{""" + context.options.string("dof_latest_version") + """}
\newcommand{\isadoflatestdoi}{""" + context.options.string("dof_latest_doi") + """}
\newcommand{\isadofgenericdoi}{""" + context.options.string("dof_generic_doi") + """}
\newcommand{\isabellelatestversion}{""" + context.options.string("dof_latest_isabelle") + """}
// root.tex from session exports
File.write(directory.doc_dir + Path.explode("root.tex"),
  the_document_entry(context, "dof/use_template").text)

// dof-config.sty
File.write(directory.doc_dir + Path.explode("dof-config.sty"), """
\newcommand{\isabelleurl}{""" + DOF.isabelle_url + """}
\newcommand{\dofurl}{""" + DOF.url + """}
\newcommand{\dof@isabelleversion}{""" + DOF.isabelle_version + """}
\newcommand{\isabellefullversion}{""" + DOF.isabelle_version + """\xspace}
\newcommand{\dof@version}{""" + DOF.version + """}
\newcommand{\dof@artifacturl}{""" + DOF.artifact_url + """}
\newcommand{\doflatestversion}{""" + DOF.latest_version + """}
\newcommand{\isadoflatestdoi}{""" + DOF.latest_doi + """}
\newcommand{\isadofgenericdoi}{""" + DOF.generic_doi + """}
\newcommand{\isabellelatestversion}{""" + DOF.latest_isabelle + """}
""")

directory
}

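The `dof/use_ontology` export above is rendered into one `\usepackage{DOF-<name>}` line per ontology. A minimal standalone sketch of that rendering step (plain Scala; `render` is a hypothetical helper standing in for the inline `.map(...).mkString` passed to `File.write`):

```scala
object OntologiesTex {
  // Render one \usepackage{DOF-<name>} line per ontology name,
  // mirroring the .map(name => "\\usepackage{DOF-" + name + "}\n").mkString above.
  def render(names: List[String]): String =
    names.map(name => "\\usepackage{DOF-" + name + "}\n").mkString

  def main(args: Array[String]): Unit =
    print(render(List("technical_report", "scholarly_paper")))
}
```

This is only the string-building half; the real code additionally splits the exported document entry into lines with `split_lines` before mapping.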
@@ -1,7 +1,7 @@
/*
 * Copyright (c)
 *   2021-2022 The University of Exeter.
 *   2021-2022 The University of Paris-Saclay.
 * Copyright (c)
 *   2021-2022 The University of Exeter.
 *   2021-2022 The University of Paris-Saclay.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions

@@ -29,32 +29,32 @@
*/


/* Author: Makarius

Prepare session root directory for Isabelle/DOF.
*/
/*** create session root directory for Isabelle/DOF ***/

package isabelle.dof

import isabelle._


object DOF_Mkroot
{
/** mkroot **/

val default_ontologies: List[String] = List("technical_report", "scholarly_paper")
val default_template = "scrreprt-modern"

def mkroot(
session_name: String = "",
session_dir: Path = Path.current,
session_parent: String = "",
init_repos: Boolean = false,
template: String = "",
ontologies: List[String] = List(),
ontologies: List[String] = default_ontologies,
template: String = default_template,
quiet: Boolean = false,
progress: Progress = new Progress): Unit =
{
Isabelle_System.make_directory(session_dir)

val name = proper_string(session_name) getOrElse session_dir.absolute_file.getName
val parent = proper_string(session_parent) getOrElse Isabelle_System.getenv("ISABELLE_LOGIC")

val root_path = session_dir + Sessions.ROOT
if (root_path.file.exists) error("Cannot overwrite existing " + root_path)

@@ -62,47 +62,54 @@ object DOF_Mkroot
val document_path = session_dir + Path.explode("document")
if (document_path.file.exists) error("Cannot overwrite existing " + document_path)

progress.echo("\nPreparing session " + quote(name) + " in " + session_dir)
progress.echo(
(if (quiet) "" else "\n") +
"Initializing session " + quote(name) + " in " + session_dir.absolute)


/* ROOT */

progress.echo(" creating " + root_path)
progress.echo_if(!quiet, " creating " + root_path)

File.write(root_path,
"session " + Mkroot.root_name(name) + " = " + Mkroot.root_name(parent) + """ +
options [document = pdf, document_output = "output", document_build = dof, dof_ontologies = """"
+ ontologies.mkString(" ") + """", dof_template = """ + Mkroot.root_name(template)
+ """, document_comment_latex=true]
"session " + Mkroot.root_name(name) + " = " + Mkroot.root_name(DOF.session) + """ +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
""" + Mkroot.root_name(name) + """
""" + Mkroot.root_name(name) + """
document_files
"preamble.tex"
""")

val thy = session_dir + Path.explode(name+".thy")
progress.echo(" creating " + thy)
val thy = session_dir + Path.explode(name + ".thy")
progress.echo_if(!quiet, " creating " + thy)
File.write(thy,
"theory\n " + name + "\nimports\n " + ontologies.mkString("\n ") + """
"theory\n " + name +
"\nimports\n " + ontologies.map("Isabelle_DOF." + _).mkString("\n ") + """
begin


use_template """ + quote(template) + """
use_ontology """ + ontologies.map(quote).mkString(" and ") + """

end
""")


/* preamble */

val preamble_tex = session_dir + Path.explode("document/preamble.tex")
progress.echo(" creating " + preamble_tex)
progress.echo_if(!quiet, " creating " + preamble_tex)
Isabelle_System.make_directory(preamble_tex.dir)
File.write(preamble_tex,"""%% This is a placeholder for user-specific configuration and packages.""")


/* Mercurial repository */

if (init_repos) {
progress.echo(" \nInitializing Mercurial repository " + session_dir)
progress.echo(
(if (quiet) "" else "\n") + "Initializing Mercurial repository " + session_dir.absolute)

val hg = Mercurial.init_repository(session_dir)


@@ -130,7 +137,7 @@ syntax: regexp

{
val print_dir = session_dir.implode
progress.echo("""
progress.echo_if(!quiet, """
Now use the following command line to build the session:

isabelle build -D """ +

@@ -139,42 +146,41 @@ Now use the following command line to build the session:
}



/** Isabelle tool wrapper **/

val isabelle_tool = Isabelle_Tool("dof_mkroot", "prepare session root directory for Isabelle/DOF",
val isabelle_tool = Isabelle_Tool("dof_mkroot", "create session root directory for Isabelle/DOF",
Scala_Project.here, args =>
{
var init_repos = false
var help = false
var session_name = ""
var session_parent = "Isabelle_DOF"
var ontologies:List[String] = List()
var template = session_parent + ".scrartcl"
val default_ontologies = List(session_parent+".scholarly_paper")
var ontologies = default_ontologies
var template = default_template
var quiet = false

val getopts = Getopts("""
Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]

Options are:
-I init Mercurial repository and add generated files
-h print help
-n NAME alternative session name (default: directory base name)
-o ONTOLOGY ontology (default: scholarly_paper)
-t TEMPLATE template (default: scrartcl)
-o NAMES list of ontologies, separated by blanks
(default: """ + quote(Word.implode(default_ontologies)) + """)
-q quiet mode: less verbosity
-t NAME template (default: """ + quote(default_template) + """)

Prepare session root directory (default: current directory).
Create session root directory for Isabelle/DOF (default: current directory).
""",
"I" -> (arg => init_repos = true),
"I" -> (_ => init_repos = true),
"h" -> (_ => help = true),
"n:" -> (arg => session_name = arg),
"o:" -> (arg => ontologies = ontologies :+ arg),
"t:" -> (arg => template = arg),
"h" -> (arg => help = true)
)
"o:" -> (arg => ontologies = Word.explode(arg)),
"q" -> (_ => quiet = true),
"t:" -> (arg => template = arg))

val more_args = getopts(args)

ontologies = if (ontologies.isEmpty) default_ontologies else ontologies

if (help) {getopts.usage()} else {()}
val session_dir =
more_args match {
case Nil => Path.current

@@ -182,7 +188,12 @@ Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]
case _ => getopts.usage()
}

mkroot(session_parent=session_parent, session_name = session_name, session_dir = session_dir, init_repos = init_repos,
ontologies = ontologies, template = template, progress = new Console_Progress)
if (help) getopts.usage()

val progress = new Console_Progress

mkroot(session_name = session_name, session_dir = session_dir,
init_repos = init_repos, quiet = quiet, progress = progress,
ontologies = ontologies, template = template)
})
}

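The new `-o NAMES` option splits its argument on blanks via `isabelle.Word.explode`, and the defaults are kept when nothing was parsed. A rough standalone approximation of that behaviour (plain Scala; `explode` here is a simplified stand-in for the Isabelle/Scala original, which is not available outside an Isabelle installation):

```scala
object OntologyOption {
  val defaultOntologies: List[String] = List("technical_report", "scholarly_paper")

  // Simplified stand-in for isabelle.Word.explode: split on whitespace runs,
  // dropping empty fragments.
  def explode(s: String): List[String] =
    s.split("\\s+").toList.filter(_.nonEmpty)

  // Mimic the fallback above: an effectively empty option value keeps the defaults.
  def parse(arg: String): List[String] = {
    val parsed = explode(arg)
    if (parsed.isEmpty) defaultOntologies else parsed
  }
}
```

So `isabelle dof_mkroot -o "scholarly_paper technical_report"` yields a two-element list, while an empty `-o ""` falls back to the defaults.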
@@ -28,6 +28,8 @@
* SPDX-License-Identifier: BSD-2-Clause
*/


/*** command-line tools for Isabelle/DOF ***/

package isabelle.dof

@@ -35,5 +37,6 @@ import isabelle._


class DOF_Tools extends Isabelle_Scala_Tools(
DOF_Mkroot.isabelle_tool
DOF.isabelle_tool,
DOF_Mkroot.isabelle_tool,
)

@@ -25,25 +25,26 @@ print_doc_items

(* this corresponds to low-level accesses : *)
ML\<open>
val {docobj_tab={tab = docitem_tab, ...},docclass_tab, ISA_transformer_tab, monitor_tab,...}
val docitem_tab = DOF_core.get_instances \<^context>
val {docclass_tab, ISA_transformer_tab, monitor_tab,...}
= DOF_core.get_data @{context};
Symtab.dest docitem_tab;
Name_Space.dest_table docitem_tab;
Symtab.dest docclass_tab;
\<close>
ML\<open>
#value(the(the(Symtab.lookup docitem_tab "aaa")))

val (oid, DOF_core.Instance {value, ...}) =
Name_Space.check (Context.Proof \<^context>) (DOF_core.get_instances \<^context>) ("aaa", Position.none)
\<close>

find_theorems (60) name:"Conceptual.M."

value [simp]"trace(M.make undefined [] ())"
value "ok(M.make undefined_AAA [] ())"
value "trace(M.make undefined_AAA [] ())"
value "tag_attribute(M.make undefined_AAA [] ())"
value [simp]"M.trace(M.make undefined [] ())"
value "M.ok(M.make undefined_AAA [] ())"
value "M.trace(M.make undefined_AAA [] ())"
value "M.tag_attribute(M.make undefined_AAA [] ())"


value "ok(M.make 0 [] ())"
value "M.ok(M.make 0 [] ())"
(*
value "ok(M.make undefined [] ())"
value "ok(M.make 0 [] undefined)"

@@ -124,11 +125,11 @@ DOF_core.get_attribute_info "Conceptual.C" "z" @{theory};


ML\<open>
DOF_core.get_value_local "sdf" @{context};
DOF_core.get_value_local "sdfg" @{context};
DOF_core.get_value_local "xxxy" @{context};
DOF_core.get_value_local "dfgdfg" @{context};
DOF_core.get_value_local "omega" @{context};
DOF_core.get_value_local ("sdf", Position.none) @{context};
DOF_core.get_value_local ("sdfg", Position.none) @{context};
DOF_core.get_value_local ("xxxy", Position.none) @{context};
DOF_core.get_value_local ("dfgdfg", Position.none) @{context};
DOF_core.get_value_local ("omega", Position.none) @{context};
\<close>

text\<open>A not too trivial test: default y -> [].

@@ -168,14 +169,22 @@ update_instance*[omega::E, y+="[''en'']"]
ML\<open> val s = map HOLogic.dest_string (HOLogic.dest_list @{docitem_attribute y::omega}); \<close>

subsection\<open> Example text antiquotation:\<close>
text\<open> @{docitem_attribute omega::y} \<close>
text\<open> @{docitem_attribute y::omega} \<close>


section\<open>Simulation of a Monitor\<close>

declare[[free_class_in_monitor_checking]]

ML\<open>
val thy = \<^theory>
val long_cid = "Isa_COL.figure_group"
val t = DOF_core.get_doc_class_global long_cid thy
\<close>
open_monitor*[figs1::figure_group,
caption="''Sample ''"]

ML\<open>val monitor_infos = DOF_core.get_monitor_infos \<^context>\<close>
text*[testFreeA::A]\<open>\<close>
figure*[fig_A::figure, spawn_columns=False,
relative_width="90",
src="''figures/A.png''"]

@@ -185,14 +194,66 @@ figure*[fig_B::figure,
spawn_columns=False,relative_width="90",
src="''figures/B.png''"]
\<open> The B train \ldots \<close>
open_monitor*[figs2::figure_group,
caption="''Sample ''"]
ML\<open>val monitor_infos = DOF_core.get_monitor_infos \<^context>\<close>
figure*[fig_C::figure, spawn_columns=False,
relative_width="90",
src="''figures/A.png''"]
\<open> The C train \ldots \<close>
open_monitor*[figs3::figure_group,
caption="''Sample ''"]
ML\<open>val monitor_infos = DOF_core.get_monitor_infos \<^context>\<close>

figure*[fig_D::figure,
spawn_columns=False,relative_width="90",
src="''figures/B.png''"]
\<open> The D train \ldots \<close>
close_monitor*[figs3]

open_monitor*[figs4::figure_group,
caption="''Sample ''"]
ML\<open>val monitor_infos = DOF_core.get_monitor_infos \<^context>\<close>

text*[testRejected1::figure_group, caption="''figures/A.png''"]
\<open> The A train \ldots \<close>

figure*[fig_E::figure,
spawn_columns=False,relative_width="90",
src="''figures/B.png''"]
\<open> The E train \ldots \<close>
close_monitor*[figs4]
close_monitor*[figs2]
text*[testRejected2::figure_group, caption="''figures/A.png''"]
\<open> The A train \ldots \<close>

close_monitor*[figs1]

declare[[free_class_in_monitor_checking = false]]

text\<open>Resulting trace of figs1 as ML antiquotation: \<close>
ML \<open>@{trace_attribute figs1}\<close>
text\<open>Resulting trace of figs as text antiquotation:\<close>
text\<open>@{trace_attribute figs1}\<close>

text\<open>Test trace_attribute term antiquotation:\<close>

term*\<open>map snd @{trace-attribute \<open>figs1\<close>}\<close>
value*\<open>map snd @{trace-attribute \<open>figs1\<close>}\<close>
term*\<open>map fst @{trace-attribute \<open>aaa\<close>}\<close>
value*\<open>map fst @{trace-attribute \<open>aaa\<close>}\<close>
term*\<open>map fst @{trace-attribute \<open>test_monitor_M\<close>}\<close>
value*\<open>map fst @{trace-attribute \<open>test_monitor_M\<close>}\<close>

definition example_expression where "example_expression \<equiv> \<lbrace>\<lfloor>''Conceptual.A''\<rfloor> || \<lfloor>''Conceptual.F''\<rfloor>\<rbrace>\<^sup>*"
value* \<open> DA.accepts (na2da (rexp2na example_expression)) (map fst @{trace-attribute \<open>aaa\<close>}) \<close>

definition word_test :: "'a list \<Rightarrow> 'a rexp \<Rightarrow> bool" (infix "is-in" 60)
where " w is-in rexp \<equiv> DA.accepts (na2da (rexp2na rexp)) (w)"

value* \<open> (map fst @{trace-attribute \<open>aaa\<close>}) is-in example_expression \<close>


(*<*)
text\<open>Final Status:\<close>
print_doc_items

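The `example_expression` check above accepts exactly those monitor traces whose class names are all `Conceptual.A` or `Conceptual.F` (the Kleene star of an alternation of two atoms). As a plain-Scala sketch of that acceptance condition (hypothetical simplification, not the `rexp2na`/`na2da` automata pipeline used in the theory):

```scala
object MonitorAcceptance {
  // (⌊Conceptual.A⌋ || ⌊Conceptual.F⌋)* accepts a trace
  // iff every element of the trace is one of the two atoms.
  def accepts(trace: List[String]): Boolean =
    trace.forall(c => c == "Conceptual.A" || c == "Conceptual.F")
}
```

The real check goes through a regular-expression-to-automaton translation, but for a star over an alternation of atoms the accepted language collapses to exactly this membership test.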
@@ -44,7 +44,7 @@ subsection*[b::A, x = "5"] \<open> Lorem ipsum dolor sit amet, ... \<close>

text\<open>Setting a sample invariant, referring to attribute value "x":\<close>
ML\<open>fun check_A_invariant oid {is_monitor:bool} ctxt =
let val term = AttributeAccess.compute_attr_access ctxt "x" oid @{here} @{here}
let val term = ISA_core.compute_attr_access ctxt "x" oid NONE @{here}
val (@{typ "int"},x_value) = HOLogic.dest_number term
in if x_value > 5 then error("class A invariant violation") else true end
\<close>

@@ -80,8 +80,11 @@ to take sub-classing into account:
\<close>

ML\<open>fun check_M_invariant oid {is_monitor} ctxt =
let val term = AttributeAccess.compute_attr_access ctxt "trace" oid @{here} @{here}
fun conv (Const(@{const_name "Pair"},_) $ Const(s,_) $ S) = (s, HOLogic.dest_string S)
let val term = ISA_core.compute_attr_access ctxt "trace" oid NONE @{here}
fun conv (\<^Const>\<open>Pair \<^typ>\<open>doc_class rexp\<close> \<^typ>\<open>string\<close>\<close>
$ (\<^Const>\<open>Atom \<^typ>\<open>doc_class\<close>\<close> $ (\<^Const>\<open>mk\<close> $ s)) $ S) =
let val s' = DOF_core.read_cid (Context.proof_of ctxt) (HOLogic.dest_string s)
in (s', HOLogic.dest_string S) end
val string_pair_list = map conv (HOLogic.dest_list term)
val cid_list = map fst string_pair_list
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)

@@ -113,10 +116,15 @@ subsection*[f::E] \<open> Lectus accumsan velit ultrices, ... }\<close>
section*[f2::E] \<open> Lectus accumsan velit ultrices, ... }\<close>
*)

ML\<open>val term = AttributeAccess.compute_attr_access
(Context.Proof @{context}) "trace" "struct" @{here} @{here} ;
ML\<open>val ctxt = @{context}
val term = ISA_core.compute_attr_access
(Context.Proof ctxt) "trace" "struct" NONE @{here} ;
fun conv (Const(@{const_name "Pair"},_) $ Const(s,_) $ S) = (s, HOLogic.dest_string S)
val string_pair_list = map conv (HOLogic.dest_list term)
fun conv' (\<^Const>\<open>Pair \<^typ>\<open>doc_class rexp\<close> \<^typ>\<open>string\<close>\<close>
$ (\<^Const>\<open>Atom \<^typ>\<open>doc_class\<close>\<close> $ (\<^Const>\<open>mk\<close> $ s)) $ S) =
let val s' = DOF_core.read_cid ctxt (HOLogic.dest_string s)
in (s', HOLogic.dest_string S) end
val string_pair_list = map conv' (HOLogic.dest_list term)
\<close>

@@ -11,10 +11,9 @@ begin

section\<open>\<^theory_text>\<open>ML*\<close>-Annotated SML-commands\<close>
ML*[the_function::B,x=\<open>\<open>dfg\<close>\<close>]\<open>fun fac x = if x = 0 then 1 else x * fac(x-1);
val t = @{const_name "List.Nil"}\<close>
val t = \<^value_>\<open>x @{B \<open>the_function\<close>}\<close>\<close>
ML\<open>fac 5; t\<close> \<comment> \<open>this is a test that ML* is actually evaluated and the
resulting toplevel state is preserved.\<close>
ML*\<open>3+4\<close> \<comment> \<open>meta-args are optional\<close>

text-macro*[the::C]\<open> @{B [display] "the_function"} \<close>

@@ -207,4 +206,8 @@ text\<open>... and here we reference @{A [display] \<open>assertionA\<close>}.\<close>

assert*\<open>evidence @{result \<open>resultProof\<close>} = evidence @{result \<open>resultProof2\<close>}\<close>

text\<open>The optional evaluator of \<open>value*\<close> and \<open>assert*\<close> must be specified after the meta arguments:\<close>
value* [optional_test_A::A, x=6] [nbe] \<open>filter (\<lambda>\<sigma>. A.x \<sigma> > 5) @{A-instances}\<close>
assert* [resultProof3::result, evidence = "proof", property="[@{thm \<open>HOL.sym\<close>}]"] [nbe]
\<open>evidence @{result \<open>resultProof3\<close>} = evidence @{result \<open>resultProof2\<close>}\<close>
end

@@ -12,7 +12,7 @@ text\<open>
theory attribute must be set:\<close>


declare[[invariants_checking = true]]
declare[[invariants_strict_checking = true]]

text\<open>For example, let's define the following two classes:\<close>

@@ -47,6 +47,42 @@ text\<open>
it inherits @{doc_class class_inv1} invariants.
Hence the \<^term>\<open>int1\<close> invariant is checked when the instance @{docitem testinv2} is defined.\<close>

text\<open>Test invariant for attributes of attributes: \<close>

doc_class inv_test1 =
a :: int

doc_class inv_test2 =
b :: "inv_test1"
c :: int
invariant inv_test2 :: "c \<sigma> = 1"
invariant inv_test2' :: "a (b \<sigma>) = 2"

doc_class inv_test3 = inv_test1 +
b :: "inv_test1"
c :: int
invariant inv_test3 :: "a \<sigma> = 2"
invariant inv_test3' :: "a (b \<sigma>) = 2"

doc_class inv_test4 = inv_test2 +
d :: "inv_test3"
invariant inv_test4 :: "a (inv_test2.b \<sigma>) = 2"
invariant inv_test4' :: "a (d \<sigma>) = 2"

text*[inv_test1_instance::inv_test1, a=2]\<open>\<close>
text*[inv_test3_instance::inv_test3, a=2, b="@{inv-test1 \<open>inv_test1_instance\<close>}" ]\<open>\<close>
text*[inv_test4_instance::inv_test4, b="@{inv-test1 \<open>inv_test1_instance\<close>}"
, c=1, d="@{inv-test3 \<open>inv_test3_instance\<close>}"]\<open>\<close>

text\<open>To support invariant on attributes in attributes
and invariant on attributes of the superclasses,
we check that the type of the attribute of the subclass is ground:\<close>
ML\<open>
val Type(st, [ty]) = \<^typ>\<open>inv_test1\<close>
val Type(st', [ty']) = \<^typ>\<open>'a inv_test1_scheme\<close>
val t = ty = \<^typ>\<open>unit\<close>
\<close>

text\<open>Now assume the following ontology:\<close>

doc_class title =

@@ -144,6 +180,6 @@ value*\<open>evidence @{result \<open>resultProof\<close>} = evidence @{result \<open>resultProof2\<close>}\<close>

declare[[invariants_checking_with_tactics = false]]

declare[[invariants_checking = false]]
declare[[invariants_strict_checking = false]]

end

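The `inv_test2` invariants above constrain both a direct attribute (`c σ = 1`) and an attribute of an attribute (`a (b σ) = 2`). As a plain-Scala analogy (hypothetical types, not the ODL implementation), an invariant is just a predicate evaluated on the instance when it is defined:

```scala
object InvariantSketch {
  // Analogues of doc_class inv_test1 / inv_test2 above.
  final case class InvTest1(a: Int)
  final case class InvTest2(b: InvTest1, c: Int)

  // Conjunction of inv_test2 ("c σ = 1") and inv_test2' ("a (b σ) = 2").
  def invTest2(i: InvTest2): Boolean = i.c == 1 && i.b.a == 2
}
```

The nested access `i.b.a` corresponds to the composed selector `a (b σ)`, which is exactly the "attributes of attributes" case the ground-type check in the ML block above is meant to support.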
@@ -1,6 +1,6 @@
chapter \<open>Notes on Isabelle/DOF for Isabelle2021-1\<close>
chapter \<open>Notes on Isabelle/DOF for Isabelle2022\<close>

theory "Isabelle2021-1"
theory "Isabelle2022"
imports Main
begin


@@ -44,33 +44,9 @@ text \<open>
we see more and more alternatives, e.g. system options or services in
Isabelle/Scala (see below).

\<^item> \<^file>\<open>$ISABELLE_DOF_HOME/etc/options\<close>

\<^item> options declared as \<^verbatim>\<open>public\<close> appear in the Isabelle/jEdit dialog
\<^action>\<open>plugin-options\<close> (according to their \<^verbatim>\<open>section\<close>)

\<^item> all options (public and non-public) are available for command-line
usage, e.g. \<^verbatim>\<open>isabelle build -o dof_url="..."\<close>

\<^item> access to options in Isabelle/ML:

\<^item> implicit (for the running ML session)
\<^ML>\<open>Options.default_string \<^system_option>\<open>dof_url\<close>\<close>

\<^item> explicit (e.g. for each theories section in
\<^file>\<open>$ISABELLE_HOME/src/Pure/Tools/build.ML\<close>):
\<^ML>\<open>fn options => Options.string options \<^system_option>\<open>dof_url\<close>\<close>

\<^item> access in Isabelle/Scala is always explicit; the initial options
should be created only once and passed around as explicit argument:

\<^scala>\<open>{
val options = isabelle.Options.init();
options.string("dof_url");
}\<close>

Note: there are no antiquotations in Isabelle/Scala, so the literal
string \<^scala>\<open>"dof_url"\<close> is unchecked.
\<^item> \<^path>\<open>$ISABELLE_DOF_HOME/etc/options\<close> should not be used for regular
Isabelle/DOF applications: thus it works properly within Isabelle/AFP,
where the component context is missing.
\<close>


@@ -107,13 +83,8 @@ subsection \<open>Document commands\<close>

text \<open>
Isar toplevel commands now support a uniform concept for
\<^ML_type>\<open>Toplevel.presentation\<close>, but the exported interfaces are
limited to commands that do not change the semantic state: see
\<^ML_type>\<open>Toplevel.presentation\<close>, e.g. see
\<^ML>\<open>Toplevel.present\<close> and \<^ML>\<open>Toplevel.present_local_theory\<close>.

Since \<^verbatim>\<open>Toplevel.present_theory\<close> is missing in Isabelle2021-1, we use a
workaround with an alternative presentation hook: it exports
\<^verbatim>\<open>document/latex_dof\<close> files instead of regular \<^verbatim>\<open>document/latex\<close>.
\<close>


@@ -174,10 +145,7 @@ section \<open>Miscellaneous NEWS and Notes\<close>

text \<open>
\<^item> Document preparation: there are many new options etc. that might help
to fine-tune DOF output, e.g. \<^system_option>\<open>document_comment_latex\<close>.

\<^item> ML: Theory_Data / Generic_Data: "val extend = I" has been removed;
obsolete since Isabelle2021.
to fine-tune DOF output, e.g. \<^system_option>\<open>document_heading_prefix\<close>.

\<^item> ML: \<^ML>\<open>Thm.instantiate\<close> and related operations now use explicit
abstract types for the instantiate, see \<^file>\<open>~~/src/Pure/term_items.ML\<close>

@@ -196,6 +164,8 @@ text \<open>
example:

\<^ML>\<open>\<^Type>\<open>nat\<close>\<close>
\<^ML>\<open>\<^Type>\<open>prod \<^Type>\<open>int\<close> \<^Type>\<open>int\<close>\<close>\<close>

\<^ML>\<open>fn (A, B) => \<^Type>\<open>fun A B\<close>\<close>
\<^ML>\<open>\<^Type_fn>\<open>fun A B => \<open>(A, B)\<close>\<close>\<close>

@@ -28,10 +28,11 @@ print_doc_classes
print_doc_items

(* this corresponds to low-level accesses : *)
ML\<open>
val {docobj_tab={tab = docitem_tab, ...},docclass_tab, ISA_transformer_tab, monitor_tab,...}
ML\<open>
val docitem_tab = DOF_core.get_instances \<^context>
val {docclass_tab, ISA_transformer_tab, ...}
= DOF_core.get_data @{context};
Symtab.dest docitem_tab;
Name_Space.dest_table docitem_tab;
Symtab.dest docclass_tab;
app;
\<close>

@@ -67,7 +68,7 @@ fun gen_enriched_document_command2 name {body} cid_transform attr_transform mark
val strg = XML.string_of (hd (Latex.output text))
val file = {path = Path.make [oid ^ "_snippet.tex"],
pos = @{here},
content = strg}
content = Bytes.string strg}

val _ = Generated_Files.write_file (Path.make ["latex_test"])
file

@@ -80,7 +81,7 @@ fun gen_enriched_document_command2 name {body} cid_transform attr_transform mark
(* ... generating the level-attribute syntax *)
in
( Value_Command.Docitem_Parser.create_and_check_docitem
{is_monitor = false} {is_inline = false}
{is_monitor = false} {is_inline = false} {define = true}
oid pos (cid_transform cid_pos) (attr_transform doc_attrs)
#> (fn thy => (app (check_n_tex_text thy) toks_list; thy)))
end;

@@ -125,7 +126,7 @@ val _ =

\<close>


ML\<open>open Bytes\<close>
text\<open>And here a tex - text macro.\<close>
text\<open>Python's ReStructuredText (RST).
@{url \<open>https://de.wikipedia.org/wiki/ReStructuredText\<close>}. Tool: Sphinx.

@@ -352,8 +353,8 @@ Config.get ;

(*
\begin{figure}[h]
\centering
\includegraphics[scale=0.5]{graph_a}
\caption{An example graph}

@@ -362,35 +363,37 @@ Config.get ;


\begin{figure}
\centering
\begin{subfigure}[b]{0.3\textwidth}
\centering
\includegraphics[width=\textwidth]{graph1}
\caption{$y=x$}
\label{fig:y equals x}
\label{fig:y equals x} (* PROBLEM *)
\end{subfigure}
\hfill
\begin{subfigure}[b]{0.3\textwidth}
\centering
\includegraphics[width=\textwidth]{graph2}
\caption{$y=3sinx$}
\label{fig:three sin x}
\label{fig:three sin x} (* PROBLEM *)
\end{subfigure}
\hfill
\begin{subfigure}[b]{0.3\textwidth}
\centering
\includegraphics[width=\textwidth]{graph3}
\caption{$y=5/x$}
\label{fig:five over x}
\label{fig:five over x} (* PROBLEM *)
\end{subfigure}
\caption{Three simple graphs}

@@ -436,7 +439,7 @@ fun figure_antiq (check: Proof.context -> Path.T option -> Input.source -> Path.
else error "scale out of range (must be between 1 and 100"
val fig_args = Library.enclose "[" "]" (commas [wdth_ltx,scl_ltx])
val _ = writeln cap
fun proportion () = "0."^ (Int.toString (100 div length(!figure_proportions)))
fun proportion () = "0."^ (Int.toString (100 div List.length(!figure_proportions)))
\<comment> \<open>naive: assumes equal proportions\<close>
fun core arg = "\n\\centering\n"
^"\\includegraphics"

@@ -502,7 +505,8 @@ Figure*[fff::figure,src="\<open>this is a side-by-side\<close>"]
\<close>


Figure*[ffff::figure,(* caption *) src="\<open>this is another 2 side-by-side\<close>"]

Figure*[ffff::figure2, caption="\<open>this is another 2 side-by-side\<close>"]
\<open>@{figure_content [width=40, scale=35, caption="This is a left test"] "../ROOT"}\<close>
\<open>@{figure_content [width=40, scale=35, caption="This is a right test"] "../ROOT"}\<close>


@@ -510,5 +514,18 @@ Figure*[ffff::figure,(* caption *) src="\<open>this is another 2 side-by-side\<close>"]
text\<open> @{figure "ffff(2)"}\<close>
*)

Figure*[figxxx::figure2,caption="\<open>Proofs establishing an Invariant Preservation.\<close>"]
\<open>@{boxed_theory_text [display]
\<open>lemma inv_c2_preserved : "c2_inv \<sigma> \<Longrightarrow> c1_inv (\<sigma> \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e)"
unfolding c1_inv_def c2_inv_def
Computer_Hardware_to_Hardware_morphism_def
Product_to_Component_morphism_def
by (auto simp: comp_def)

lemma Computer_Hardware_to_Hardware_total :
"Computer_Hardware_to_Hardware_morphism ` ({X. c2_inv X})
\<subseteq> ({X::Hardware. c1_inv X})"
using inv_c2_preserved by auto\<close>}\<close>

end
(*>*)

@@ -7,6 +7,7 @@ session "Isabelle_DOF-tests" = "Isabelle_DOF" +
"TermAntiquotations"
"Attributes"
"Evaluation"
"Isabelle2021-1"
"High_Level_Syntax_Invariants"
"Ontology_Matching_Example"
theories [condition = ISABELLE_DOF_HOME]
"Isabelle2022"

@ -41,8 +41,9 @@ ODL on a paradigmatical example.
|
|||
|
||||
|
||||
text\<open>Voila the content of the Isabelle_DOF environment so far:\<close>
|
||||
ML\<open>
|
||||
val {docobj_tab={tab = x, ...},docclass_tab, ISA_transformer_tab,...} = DOF_core.get_data @{context};
|
||||
ML\<open>
|
||||
val x = DOF_core.get_instances \<^context>
|
||||
val {docclass_tab, ISA_transformer_tab,...} = DOF_core.get_data @{context};
|
||||
Symtab.dest ISA_transformer_tab;
|
||||
\<close>
|
||||
|
||||
|
@@ -91,11 +92,12 @@ update_instance*[xcv4::F, b+="{(@{docitem ''xcv3''},@{docitem ''xcv5''})}"]

text\<open>And here are the results of some ML-term antiquotations:\<close>
ML\<open> @{docitem_attribute b::xcv4} \<close>
ML\<open> @{docitem xcv4} \<close>
ML\<open> @{docitem_name xcv4} \<close>
ML\<open> @{trace_attribute aaa} \<close>

text\<open>Now we might need to reference a class instance in a term command, and we would like
Isabelle to check that this instance is indeed an instance of this class.
Here, we want to reference the instance @{docitem \<open>xcv4\<close>} previously defined.
Here, we want to reference the instance @{docitem_name "xcv4"} previously defined.
We can use the term* command, which extends the classic term command
and does the appropriate checking.\<close>
term*\<open>@{F \<open>xcv4\<close>}\<close>
@@ -104,15 +106,37 @@ text\<open>We can also reference an attribute of the instance.
Here we reference the attribute r of the class F, which has the type @{typ \<open>thm list\<close>}.\<close>
term*\<open>r @{F \<open>xcv4\<close>}\<close>

term \<open>@{A \<open>xcv2\<close>}\<close>

text\<open>We declare a new text element. Note that the class name contains an underscore "_".\<close>
text*[te::text_element]\<open>Lorem ipsum...\<close>

text\<open>Unfortunately, due to different lexical conventions for constant symbols and mixfix symbols,
this term antiquotation has to be denoted like this: @{term\<open>@{text-element \<open>ee\<close>}\<close>}.
this term antiquotation has to be denoted like this: @{term_ \<open>@{text-element \<open>te\<close>}\<close>}.
We need to substitute a hyphen "-" for the underscore "_".\<close>
term*\<open>@{text-element \<open>te\<close>}\<close>

text\<open>Terms containing term antiquotations can be checked and evaluated
using the \<^theory_text>\<open>term_\<close> and \<^theory_text>\<open>value_\<close> text antiquotations, respectively:
we can print the term @{term_ \<open>r @{F \<open>xcv4\<close>}\<close>} with \<open>@{term_ \<open>r @{F \<open>xcv4\<close>}\<close>}\<close>,
or get the value of the \<^const>\<open>F.r\<close> attribute of @{docitem \<open>xcv4\<close>} with \<open>@{value_ \<open>r @{F \<open>xcv4\<close>}\<close>}\<close>.
\<^theory_text>\<open>value_\<close> may take an optional argument between square brackets to specify the evaluator, but this
argument must be specified after a default optional argument already defined
by the text antiquotation implementation.
So one must use the following syntax if one does not want to specify the first optional argument:
\<open>@{value_ [] [nbe] \<open>r @{F \<open>xcv4\<close>}\<close>}\<close>. Note the empty brackets.
\<close>

text\<open>There are also \<^theory_text>\<open>term_\<close> and \<^theory_text>\<open>value_\<close> ML antiquotations:
\<^ML>\<open>@{term_ \<open>r @{F \<open>xcv4\<close>}\<close>}\<close> will return the ML representation of the term \<^term_>\<open>r @{F \<open>xcv4\<close>}\<close>,
and \<^ML>\<open>@{value_ \<open>r @{F \<open>xcv4\<close>}\<close>}\<close> will return the ML representation
of the value of the \<^const>\<open>F.r\<close> attribute of @{docitem \<open>xcv4\<close>}.
\<^theory_text>\<open>value_\<close> may take an optional argument between square brackets to specify the evaluator:
\<close>

ML\<open>
val t = @{term_ \<open>r @{F \<open>xcv4\<close>}\<close>}
val tt = @{value_ [nbe] \<open>r @{F \<open>xcv4\<close>}\<close>}
\<close>

end
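To summarize the `value_` argument conventions discussed in the hunk above, here is a compact sketch in the same theory context; it only recombines forms that already appear in the diff (`F`, `xcv4`, and the `[nbe]` evaluator), so treat it as illustrative rather than additional repository content:

```
text\<open>
(* Defaults only: first optional argument and evaluator both defaulted. *)
@{value_ \<open>r @{F \<open>xcv4\<close>}\<close>}
(* Explicit evaluator: the empty brackets stand in for the first,
   defaulted optional argument, which must come before [nbe]. *)
@{value_ [] [nbe] \<open>r @{F \<open>xcv4\<close>}\<close>}
\<close>
```

The design point is that the antiquotation's first bracket slot is reserved by the implementation, so the evaluator can only occupy the second slot.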