forked from Isabelle_DOF/Isabelle_DOF

Compare commits: main ... enable-inv (66 commits)
Author | SHA1 | Date |
---|---|---|
Nicolas Méric | 9bc493250f | |
Nicolas Méric | 0b2d28b547 | |
Nicolas Méric | 37d7ed7d17 | |
Nicolas Méric | 312734afbd | |
Burkhart Wolff | 8cee80d78e | |
Makarius Wenzel | ec0d525426 | |
Makarius Wenzel | 791990039b | |
Makarius Wenzel | 78d61390fe | |
Makarius Wenzel | ffcf1f3240 | |
Makarius Wenzel | 5471d873a9 | |
Makarius Wenzel | df37250a00 | |
Makarius Wenzel | 185daeb577 | |
Makarius Wenzel | 8037fd15f2 | |
Makarius Wenzel | afcd78610b | |
Makarius Wenzel | b8a9ef5118 | |
Makarius Wenzel | a4e75c8b12 | |
Makarius Wenzel | d20e9ccd22 | |
Makarius Wenzel | f2ee5d3780 | |
Makarius Wenzel | 44cae2e631 | |
Makarius Wenzel | 7b2bf35353 | |
Makarius Wenzel | e8c7fa6018 | |
Makarius Wenzel | b12e61511d | |
Makarius Wenzel | 3cac42e6cb | |
Makarius Wenzel | aee8ba1df1 | |
Makarius Wenzel | d93e1383d4 | |
Makarius Wenzel | 3d5d1e7476 | |
Makarius Wenzel | 4264e7cd15 | |
Makarius Wenzel | 96f4077c53 | |
Makarius Wenzel | d7fb39d7eb | |
Makarius Wenzel | b95826962f | |
Makarius Wenzel | 912d4bb49e | |
Makarius Wenzel | a6c1a2baa4 | |
Makarius Wenzel | bb5963c6e2 | |
Makarius Wenzel | cc3e2a51a4 | |
Makarius Wenzel | 9e4e5b49eb | |
Makarius Wenzel | b65ecbdbef | |
Makarius Wenzel | 3be2225dcf | |
Makarius Wenzel | f44f0af01c | |
Makarius Wenzel | 9a11baf840 | |
Makarius Wenzel | 48c167aa23 | |
Makarius Wenzel | 700a9bbfee | |
Makarius Wenzel | 73299941ad | |
Makarius Wenzel | 5a8c438c41 | |
Makarius Wenzel | 7772c73aaa | |
Makarius Wenzel | ca18453043 | |
Makarius Wenzel | 1a122b1a87 | |
Makarius Wenzel | 47d95c467e | |
Makarius Wenzel | bf3085d4c0 | |
Makarius Wenzel | 068e6e0411 | |
Makarius Wenzel | 09e9980691 | |
Makarius Wenzel | 94ce3fdec2 | |
Makarius Wenzel | 44819bff02 | |
Makarius Wenzel | a6ab1e101e | |
Makarius Wenzel | c29ec9641a | |
Nicolas Méric | 06833aa190 | |
Nicolas Méric | 4f0c7e1e95 | |
Nicolas Méric | 0040949cf8 | |
Nicolas Méric | e68c332912 | |
Burkhart Wolff | b2c4f40161 | |
Burkhart Wolff | 309952e0ce | |
Burkhart Wolff | 830e1b440a | |
Burkhart Wolff | 2149db9efc | |
Burkhart Wolff | 1547ace64b | |
Burkhart Wolff | 39acd61dfd | |
Burkhart Wolff | 29770b17ee | |
Achim D. Brucker | b4f4048cff | |
```diff
@@ -7,9 +7,10 @@ pipeline:
 - export `isabelle getenv ISABELLE_HOME_USER`
 - mkdir -p $ISABELLE_HOME_USER/etc
 - echo "ISABELLE_PDFLATEX=\"$LATEX --file-line-error\"" >> $ISABELLE_HOME_USER/etc/settings
-- isabelle components -u `pwd`
-- isabelle build -D . -o browser_info
-- isabelle dof_mkroot DOF_test
+- isabelle components -u .
+- isabelle build -D . -o browser_info
+- isabelle dof_mkroot -q DOF_test
 - isabelle build -D DOF_test
 - cp -r $ISABELLE_HOME_USER/browser_info $ARTIFACT_DIR
 - cd $ARTIFACT_DIR
```
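The settings step of this pipeline can be exercised on its own; a minimal sketch, assuming a hypothetical `$LATEX` value (in the real pipeline it comes from the CI environment):

```shell
# Hypothetical stand-ins for CI-provided values.
LATEX="lualatex"
ISABELLE_HOME_USER="$(mktemp -d)"

# Same steps as the pipeline: create etc/ and append the setting.
mkdir -p "$ISABELLE_HOME_USER/etc"
echo "ISABELLE_PDFLATEX=\"$LATEX --file-line-error\"" >> "$ISABELLE_HOME_USER/etc/settings"
cat "$ISABELLE_HOME_USER/etc/settings"
```

The inner escaped quotes survive into the settings file, so the value is a single quoted string when Isabelle later sources it.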
```diff
@@ -194,8 +194,8 @@ for i in $VARS; do
   export "$i"
 done
 
-ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL options -g dof_isabelle)"
-DOF_VERSION="$($ISABELLE_TOOL options -g dof_version)"
+ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL dof_param -b isabelle_version)"
+DOF_VERSION="$($ISABELLE_TOOL dof_param -b dof_version)"
 
 ISABELLE_SHORT_VERSION=`echo $ISABELLE_VERSION | sed -e 's/:.*$//'`
 ISADOF_TAR="Isabelle_DOF-"$DOF_VERSION"_"$ISABELLE_SHORT_VERSION
```
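The version wrangling in this hunk is plain shell; a runnable sketch with hypothetical sample values standing in for the `dof_param` output:

```shell
# Hypothetical sample values (assumption: the real ones come from
# `isabelle dof_param` at build time).
ISABELLE_VERSION="Isabelle2022: October 2022"
DOF_VERSION="1.3.0"

# As in the script: strip everything from the first colon onward.
ISABELLE_SHORT_VERSION=$(echo "$ISABELLE_VERSION" | sed -e 's/:.*$//')
ISADOF_TAR="Isabelle_DOF-${DOF_VERSION}_${ISABELLE_SHORT_VERSION}"
echo "$ISADOF_TAR"
# → Isabelle_DOF-1.3.0_Isabelle2022
```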
README.md (18 changes)
```diff
@@ -10,10 +10,10 @@ online.](https://artifacts.logicalhacking.com/releases/Isabelle_DOF/Isabelle_DOF
 Isabelle/DOF has three major prerequisites:
 
 * **Isabelle:** Isabelle/DOF requires [Isabelle
-  2022](http://isabelle.in.tum.de/website-Isabelle2022/). Please download the
+  2022](https://isabelle.in.tum.de/website-Isabelle2022/). Please download the
   Isabelle 2022 distribution for your operating system from the [Isabelle
-  website](http://isabelle.in.tum.de/website-Isabelle2022/).
+  website](https://isabelle.in.tum.de/website-Isabelle2022/).
-* **AFP:** Isabelle/DOF requires two entries from the [Archive of Formal Proofs
+* **AFP:** Isabelle/DOF requires several entries from the [Archive of Formal Proofs
   (AFP)](https://www.isa-afp.org/). Please install the AFP following the
   instructions given at <https://www.isa-afp.org/using.html>.
 * **LaTeX:** Isabelle/DOF requires a modern LaTeX installation, i.e., at least
```
````diff
@@ -29,7 +29,7 @@ this command already during the installation of the prerequisites, you can skip
 it now):
 
 ```console
-foo@bar:~$ isabelle components -u `pwd`
+foo@bar:~$ isabelle components -u .
 ```
 
 The final step for the installation is:
````
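The switch from `` `pwd` `` to `.` is a spelling change only, on the assumption that the tool resolves relative paths against the working directory; a quick shell check of that equivalence:

```shell
# Both spellings name the same directory when resolved from the
# current working directory.
workdir="$(mktemp -d)"
cd "$workdir"
abs="$(pwd)"        # what `pwd` would have passed
rel="."             # what the new command passes
resolved="$(cd "$rel" && pwd)"
test "$resolved" = "$abs" && echo same
# → same
```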
````diff
@@ -91,15 +91,19 @@ templates:
 
 ```console
 foo@bar:~$ isabelle dof_mkroot -h
 
 Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]
 
   Options are:
     -I           init Mercurial repository and add generated files
+    -h           print help
     -n NAME      alternative session name (default: directory base name)
-    -o ONTOLOGY  ontology (default: scholarly_paper)
-    -t TEMPLATE  tempalte (default: scrartcl)
+    -o NAMES     list of ontologies, separated by blanks
+                 (default: "technical_report scholarly_paper")
+    -q           quiet mode: less verbosity
+    -t NAME      template (default: "scrreprt-modern")
 
-Prepare session root directory (default: current directory).
+Create session root directory for Isabelle/DOF (default: current directory).
 ```
 
 ## Releases
````
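The new `-o NAMES` option takes a blank-separated list; a small sketch of how such a value splits in shell (the default ontology names below are taken from this diff):

```shell
# Blank-separated list, as in the new -o NAMES default.
ONTOLOGIES="technical_report scholarly_paper"

count=0
for o in $ONTOLOGIES; do   # unquoted on purpose: word-splitting does the work
  echo "ontology: $o"
  count=$((count + 1))
done
echo "total: $count"
# → total: 2
```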
|
@ -4,6 +4,7 @@ no_build = false
|
||||||
requirements = \
|
requirements = \
|
||||||
env:ISABELLE_SCALA_JAR
|
env:ISABELLE_SCALA_JAR
|
||||||
sources = \
|
sources = \
|
||||||
|
src/scala/dof.scala \
|
||||||
src/scala/dof_document_build.scala \
|
src/scala/dof_document_build.scala \
|
||||||
src/scala/dof_mkroot.scala \
|
src/scala/dof_mkroot.scala \
|
||||||
src/scala/dof_tools.scala
|
src/scala/dof_tools.scala
|
||||||
|
|
etc/options (30 changes)
```diff
@@ -1,30 +0,0 @@
-(* :mode=isabelle-options: *)
-
-section "Isabelle/DOF"
-
-public option dof_template : string = "scrreprt-modern"
-  -- "Default document template for Isabelle/DOF documents"
-
-public option dof_ontologies : string = "Isabelle_DOF.technical_report Isabelle_DOF.scholarly_paper"
-  -- "Isabelle/DOF ontologies (separated by blanks)"
-
-option dof_version : string = "Unreleased"
-  -- "Isabelle/DOF version"
-  (* "Unreleased" for development, semantic version for releases *)
-
-option dof_isabelle : string = "2022"
-option dof_afp : string = "afp-2022-10-27"
-
-option dof_latest_version : string = "1.3.0"
-option dof_latest_isabelle : string = "Isabelle2021-1"
-option dof_latest_doi : string = "10.5281/zenodo.6810799"
-option dof_generic_doi : string = "10.5281/zenodo.3370482"
-
-option dof_url : string = "https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF"
-  -- "Isabelle/DOF source repository"
-
-option dof_artifact_dir : string = "releases/Isabelle_DOF/Isabelle_DOF"
-  -- "Isabelle/DOF release artifacts"
-
-option dof_artifact_host : string = "artifacts.logicalhacking.com"
```
```diff
@@ -1,5 +1,4 @@
 # -*- shell-script -*- :mode=shellscript:
 
 ISABELLE_DOF_HOME="$COMPONENT"
-ISABELLE_DOCS="$ISABELLE_DOF_HOME/doc":"$ISABELLE_DOCS"
+ISABELLE_DOCS="$ISABELLE_DOF_HOME/doc:$ISABELLE_DOCS"
```
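The settings change is pure quoting: one quoted string instead of two concatenated ones, with the same resulting search path. A sketch with hypothetical paths (in a real installation `$COMPONENT` is set by Isabelle's component mechanism):

```shell
# Hypothetical stand-in paths.
COMPONENT="/opt/Isabelle_DOF"
ISABELLE_DOCS="/opt/Isabelle2022/doc"

ISABELLE_DOF_HOME="$COMPONENT"
# New spelling: the colon lives inside a single pair of quotes.
ISABELLE_DOCS="$ISABELLE_DOF_HOME/doc:$ISABELLE_DOCS"
echo "$ISABELLE_DOCS"
# → /opt/Isabelle_DOF/doc:/opt/Isabelle2022/doc
```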
```diff
@@ -1,7 +1,5 @@
 session "mini_odo" = "Isabelle_DOF" +
-  options [document = pdf, document_output = "output", document_build = dof,
-    dof_ontologies = "Isabelle_DOF.technical_report Isabelle_DOF.cenelec_50128",
-    dof_template = "Isabelle_DOF.scrreprt-modern"]
+  options [document = pdf, document_output = "output", document_build = dof]
   sessions
     "Physical_Quantities"
   theories
```
```diff
@@ -19,6 +19,8 @@ imports
   "Isabelle_DOF.technical_report"
   "Physical_Quantities.SI" "Physical_Quantities.SI_Pretty"
 begin
+use_template "scrreprt-modern"
+use_ontology technical_report and CENELEC_50128
 declare[[strict_monitor_checking=true]]
 define_shortcut* dof \<rightleftharpoons> \<open>\dof\<close>
          isadof \<rightleftharpoons> \<open>\isadof{}\<close>
```
```diff
@@ -294,7 +296,7 @@ and the global model parameters such as wheel diameter, the number of teeth per
 sampling frequency etc., we can infer the maximal time of service as well the maximum distance
 the device can measure. As an example configuration, choosing:
 
-\<^item> \<^term>\<open>(1 *\<^sub>Q metre)::real[m]\<close> for \<^term>\<open>w\<^sub>d\<close> (wheel-diameter),
+\<^item> \<^term>\<open>(1 *\<^sub>Q metre):: real[m]\<close> for \<^term>\<open>w\<^sub>d\<close> (wheel-diameter),
 \<^item> \<^term>\<open>100 :: real\<close> for \<^term>\<open>tpw\<close> (teeth per wheel),
 \<^item> \<^term>\<open>80 *\<^sub>Q kmh :: real[m\<cdot>s\<^sup>-\<^sup>1]\<close> for \<^term>\<open>Speed\<^sub>M\<^sub>a\<^sub>x\<close>,
 \<^item> \<^term>\<open>14.4 *\<^sub>Q kHz :: real[s\<^sup>-\<^sup>1]\<close> for the sampling frequency,
```
```diff
@@ -654,9 +656,9 @@ text*[t10::test_result]
 test-execution via, \<^eg>, a makefile or specific calls to a test-environment or test-engine. \<close>
 
 
-text
-\<open> Finally some examples of references to doc-items, i.e. text-elements
-with declared meta-information and status. \<close>
+text \<open> Finally some examples of references to doc-items, i.e. text-elements
+with declared meta-information and status. \<close>
 text \<open> As established by @{test_result (unchecked) \<open>t10\<close>},
 @{test_result (define) \<open>t10\<close>} \<close>
 text \<open> the @{test_result \<open>t10\<close>}
```
```diff
@@ -16,6 +16,9 @@ theory IsaDofApplications
 imports "Isabelle_DOF.scholarly_paper"
 begin
 
+use_template "lncs"
+use_ontology "scholarly_paper"
+
 open_monitor*[this::article]
 declare[[strict_monitor_checking=false]]
```
```diff
@@ -71,7 +74,7 @@ abstract*[abs::abstract, keywordlist="[''Ontology'',''Ontological Modeling'',''I
 \<close>
 
 section*[intro::introduction]\<open> Introduction \<close>
-text*[introtext::introduction]\<open>
+text*[introtext::introduction, level = "Some 1"]\<open>
 The linking of the \<^emph>\<open>formal\<close> to the \<^emph>\<open>informal\<close> is perhaps the
 most pervasive challenge in the digitization of knowledge and its
 propagation. This challenge incites numerous research efforts
```
```diff
@@ -120,7 +123,7 @@ declare_reference*[ontomod::text_section]
 declare_reference*[ontopide::text_section]
 declare_reference*[conclusion::text_section]
 (*>*)
-text*[plan::introduction]\<open> The plan of the paper is follows: we start by introducing the underlying
+text*[plan::introduction, level="Some 1"]\<open> The plan of the paper is follows: we start by introducing the underlying
 Isabelle system (@{text_section (unchecked) \<open>bgrnd\<close>}) followed by presenting the
 essentials of \<^isadof> and its ontology language (@{text_section (unchecked) \<open>isadof\<close>}).
 It follows @{text_section (unchecked) \<open>ontomod\<close>}, where we present three application
```
```diff
@@ -130,7 +133,7 @@ conclusions and discuss related work in @{text_section (unchecked) \<open>conclu
 
 section*[bgrnd::text_section,main_author="Some(@{docitem ''bu''}::author)"]
 \<open> Background: The Isabelle System \<close>
-text*[background::introduction]\<open>
+text*[background::introduction, level="Some 1"]\<open>
 While Isabelle is widely perceived as an interactive theorem prover
 for HOL (Higher-order Logic)~@{cite "nipkow.ea:isabelle:2002"}, we
 would like to emphasize the view that Isabelle is far more than that:
```
```diff
@@ -159,7 +162,7 @@ figure*[architecture::figure,relative_width="100",src="''figures/isabelle-archit
 asynchronous communication between the Isabelle system and
 the IDE (right-hand side). \<close>
 
-text*[blug::introduction]\<open> The Isabelle system architecture shown in @{figure \<open>architecture\<close>}
+text*[blug::introduction, level="Some 1"]\<open> The Isabelle system architecture shown in @{figure \<open>architecture\<close>}
 comes with many layers, with Standard ML (SML) at the bottom layer as implementation
 language. The architecture actually foresees a \<^emph>\<open>Nano-Kernel\<close> (our terminology) which
 resides in the SML structure \<^ML_structure>\<open>Context\<close>. This structure provides a kind of container called
```
```diff
@@ -191,7 +194,7 @@ For the antiquotation \inlineisar+\at{value "fac 5"}+ we assume the usual defin
 \inlineisar+fac+ in HOL.
 \<close>
 
-text*[anti]\<open> Thus, antiquotations can refer to formal content, can be type-checked before being
+text*[anti::introduction, level = "Some 1"]\<open> Thus, antiquotations can refer to formal content, can be type-checked before being
 displayed and can be used for calculations before actually being typeset. When editing,
 Isabelle's PIDE offers auto-completion and error-messages while typing the above
 \<^emph>\<open>semi-formal\<close> content. \<close>
```
```diff
@@ -1,7 +1,5 @@
 session "2018-cicm-isabelle_dof-applications" = "Isabelle_DOF" +
-  options [document = pdf, document_output = "output", document_build = dof,
-    dof_ontologies = "Isabelle_DOF.scholarly_paper", dof_template = Isabelle_DOF.lncs,
-    quick_and_dirty = true]
+  options [document = pdf, document_output = "output", document_build = dof, quick_and_dirty = true]
   theories
     IsaDofApplications
   document_files
```
```diff
@@ -1,6 +1,5 @@
 session "2020-iFM-csp" = "Isabelle_DOF" +
-  options [document = pdf, document_output = "output", document_build = dof,
-    dof_ontologies = "Isabelle_DOF.scholarly_paper", dof_template = Isabelle_DOF.scrartcl]
+  options [document = pdf, document_output = "output", document_build = dof]
   theories
     "paper"
   document_files
```
```diff
@@ -6870,7 +6870,7 @@ isbn="978-3-540-48509-4"
   title = {{Isabelle's} Logic: {HOL}},
   author = {Tobias Nipkow and Lawrence C. Paulson and Markus Wenzel},
   year = 2009,
-  misc = {\url{http://isabelle.in.tum.de/library/HOL/}}
+  misc = {\url{https://isabelle.in.tum.de/library/HOL/}}
 }
 
 @InProceedings{ garson.ea:security:2008,
```
```diff
@@ -11000,7 +11000,7 @@ isbn="978-1-4471-3182-3"
   journal = {Archive of Formal Proofs},
   month = apr,
   year = 2019,
-  note = {\url{http://isa-afp.org/entries/HOL-CSP.html}},
+  note = {\url{https://isa-afp.org/entries/HOL-CSP.html}},
   ISSN = {2150-914x},
 }
```
```diff
@@ -3,6 +3,8 @@ theory "paper"
 imports "Isabelle_DOF.scholarly_paper"
 begin
 
+use_template "scrartcl"
+use_ontology "scholarly_paper"
+
 open_monitor*[this::article]
 
```
```diff
@@ -50,7 +52,7 @@ abstract*[abs, keywordlist="[\<open>Shallow Embedding\<close>,\<open>Process-Alg
 \<close>
 text\<open>\<close>
 section*[introheader::introduction,main_author="Some(@{docitem ''bu''}::author)"]\<open> Introduction \<close>
-text*[introtext::introduction]\<open>
+text*[introtext::introduction, level="Some 1"]\<open>
 Communicating Sequential Processes (\<^csp>) is a language
 to specify and verify patterns of interaction of concurrent systems.
 Together with CCS and LOTOS, it belongs to the family of \<^emph>\<open>process algebras\<close>.
```
```diff
@@ -152,7 +154,7 @@ processes \<open>Skip\<close> (successful termination) and \<open>Stop\<close> (
 \<open>\<T>(Skip) = \<T>(Stop) = {[]}\<close>.
 Note that the trace sets, representing all \<^emph>\<open>partial\<close> history, is in general prefix closed.\<close>
 
-text*[ex1::math_example, status=semiformal] \<open>
+text*[ex1::math_example, status=semiformal, level="Some 1"] \<open>
 Let two processes be defined as follows:
 
 \<^enum> \<open>P\<^sub>d\<^sub>e\<^sub>t = (a \<rightarrow> Stop) \<box> (b \<rightarrow> Stop)\<close>
```
```diff
@@ -352,7 +354,7 @@ Roscoe and Brooks @{cite "Roscoe1992AnAO"} finally proposed another ordering, ca
 that completeness could at least be assured for read-operations. This more complex ordering
 is based on the concept \<^emph>\<open>refusals after\<close> a trace \<open>s\<close> and defined by \<open>\<R> P s \<equiv> {X | (s, X) \<in> \<F> P}\<close>.\<close>
 
-Definition*[process_ordering, short_name="''process ordering''"]\<open>
+Definition*[process_ordering, level= "Some 2", short_name="''process ordering''"]\<open>
 We define \<open>P \<sqsubseteq> Q \<equiv> \<psi>\<^sub>\<D> \<and> \<psi>\<^sub>\<R> \<and> \<psi>\<^sub>\<M> \<close>, where
 \<^enum> \<open>\<psi>\<^sub>\<D> = \<D> P \<supseteq> \<D> Q \<close>
 \<^enum> \<open>\<psi>\<^sub>\<R> = s \<notin> \<D> P \<Rightarrow> \<R> P s = \<R> Q s\<close>
```
```diff
@@ -528,10 +530,10 @@ To handle termination better, we added two new processes \<open>CHAOS\<^sub>S\<^
 \<close>
 
 (*<*) (* a test ...*)
-text*[X22 ::math_content ]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
-text*[X32::"definition", mcc=defn]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
-Definition*[X42]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
-Definition*[X52::"definition"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
+text*[X22 ::math_content, level="Some 2" ]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
+text*[X32::"definition", level="Some 2", mcc=defn]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
+Definition*[X42, level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
+Definition*[X52::"definition", level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
 
 text\<open> The \<open>RUN\<close>-process defined @{math_content X22} represents the process that accepts all
 events, but never stops nor deadlocks. The \<open>CHAOS\<close>-process comes in two variants shown in
```
```diff
@@ -539,11 +541,11 @@ events, but never stops nor deadlocks. The \<open>CHAOS\<close>-process comes in
 stops or accepts any offered event, whereas \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P\<close> can additionally terminate.\<close>
 (*>*)
 
-Definition*[X2]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
-Definition*[X3]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
-Definition*[X4]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close>\<close>
-Definition*[X5]\<open>\<open>DF A \<equiv> \<mu> X. (\<sqinter> x \<in> A \<rightarrow> X)\<close> \<close>
-Definition*[X6]\<open>\<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. ((\<sqinter> x \<in> A \<rightarrow> X) \<sqinter> SKIP)\<close> \<close>
+Definition*[X2, level="Some 2"]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
+Definition*[X3, level="Some 2"]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
+Definition*[X4, level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close>\<close>
+Definition*[X5, level="Some 2"]\<open>\<open>DF A \<equiv> \<mu> X. (\<sqinter> x \<in> A \<rightarrow> X)\<close> \<close>
+Definition*[X6, level="Some 2"]\<open>\<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. ((\<sqinter> x \<in> A \<rightarrow> X) \<sqinter> SKIP)\<close> \<close>
 
 text\<open>In the following, we denote \<open> \<R>\<P> = {DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P, DF, RUN, CHAOS, CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P}\<close>.
 All five reference processes are divergence-free.
```
```diff
@@ -605,16 +607,16 @@ handled separately. One contribution of our work is establish their precise rela
 the Failure/Divergence Semantics of \<^csp>.\<close>
 
 (* bizarre: Definition* does not work for this single case *)
-text*[X10::"definition"]\<open> \<open>deadlock\<^sub>-free P \<equiv> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F> P\<close> \<close>
+text*[X10::"definition", level="Some 2"]\<open> \<open>deadlock\<^sub>-free P \<equiv> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F> P\<close> \<close>
 
 text\<open>\<^noindent> A process \<open>P\<close> is deadlock-free if and only if after any trace \<open>s\<close> without \<open>\<surd>\<close>, the union of \<open>\<surd>\<close>
 and all events of \<open>P\<close> can never be a refusal set associated to \<open>s\<close>, which means that \<open>P\<close> cannot
 be deadlocked after any non-terminating trace.
 \<close>
 
-Theorem*[T1, short_name="\<open>DF definition captures deadlock-freeness\<close>"]
+Theorem*[T1, short_name="\<open>DF definition captures deadlock-freeness\<close>", level="Some 2"]
 \<open> \hfill \break \<open>deadlock_free P \<longleftrightarrow> (\<forall>s\<in>\<T> P. tickFree s \<longrightarrow> (s, {\<surd>}\<union>events_of P) \<notin> \<F> P)\<close> \<close>
-Definition*[X11]\<open> \<open>livelock\<^sub>-free P \<equiv> \<D> P = {} \<close> \<close>
+Definition*[X11, level="Some 2"]\<open> \<open>livelock\<^sub>-free P \<equiv> \<D> P = {} \<close> \<close>
 
 text\<open> Recall that all five reference processes are livelock-free.
 We also have the following lemmas about the
```
```diff
@@ -628,7 +630,7 @@ text\<open>
 Finally, we proved the following theorem that confirms the relationship between the two vital
 properties:
 \<close>
-Theorem*[T2, short_name="''DF implies LF''"]
+Theorem*[T2, short_name="''DF implies LF''", level="Some 2"]
 \<open> \<open>deadlock_free P \<longrightarrow> livelock_free P\<close> \<close>
 
 text\<open>
```
```diff
@@ -795,7 +797,7 @@ This normal form is closed under deterministic and communication operators.
 The advantage of this format is that we can mimick the well-known product automata construction
 for an arbitrary number of synchronized processes under normal form.
 We only show the case of the synchronous product of two processes: \<close>
-text*[T3::"theorem", short_name="\<open>Product Construction\<close>"]\<open>
+text*[T3::"theorem", short_name="\<open>Product Construction\<close>", level="Some 2"]\<open>
 Parallel composition translates to normal form:
 @{cartouche [display,indent=5]\<open>(P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>\<^sub>1,\<upsilon>\<^sub>1\<rbrakk> \<sigma>\<^sub>1) || (P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>\<^sub>2,\<upsilon>\<^sub>2\<rbrakk> \<sigma>\<^sub>2) =
 P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<lambda>(\<sigma>\<^sub>1,\<sigma>\<^sub>2). \<tau>\<^sub>1 \<sigma>\<^sub>1 \<inter> \<tau>\<^sub>2 \<sigma>\<^sub>2 , \<lambda>(\<sigma>\<^sub>1,\<sigma>\<^sub>2).\<lambda>e.(\<upsilon>\<^sub>1 \<sigma>\<^sub>1 e, \<upsilon>\<^sub>2 \<sigma>\<^sub>2 e)\<rbrakk> (\<sigma>\<^sub>1,\<sigma>\<^sub>2)\<close>}
```
@ -815,7 +817,7 @@ states via the closure \<open>\<RR>\<close>, which is defined inductively over:
|
||||||
Thus, normalization leads to a new characterization of deadlock-freeness inspired
|
Thus, normalization leads to a new characterization of deadlock-freeness inspired
|
||||||
from automata theory. We formally proved the following theorem:\<close>
|
from automata theory. We formally proved the following theorem:\<close>
|
||||||
|
|
||||||
text*[T4::"theorem", short_name="\<open>DF vs. Reachability\<close>"]
|
text*[T4::"theorem", short_name="\<open>DF vs. Reachability\<close>", level="Some 2"]
|
||||||
\<open> If each reachable state \<open>s \<in> (\<RR> \<tau> \<upsilon>)\<close> has outgoing transitions,
|
\<open> If each reachable state \<open>s \<in> (\<RR> \<tau> \<upsilon>)\<close> has outgoing transitions,
|
||||||
the \<^csp> process is deadlock-free:
|
the \<^csp> process is deadlock-free:
|
||||||
@{cartouche [display,indent=10] \<open>\<forall>\<sigma> \<in> (\<RR> \<tau> \<upsilon> \<sigma>\<^sub>0). \<tau> \<sigma> \<noteq> {} \<Longrightarrow> deadlock_free (P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>,\<upsilon>\<rbrakk> \<sigma>\<^sub>0)\<close>}
|
@{cartouche [display,indent=10] \<open>\<forall>\<sigma> \<in> (\<RR> \<tau> \<upsilon> \<sigma>\<^sub>0). \<tau> \<sigma> \<noteq> {} \<Longrightarrow> deadlock_free (P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>,\<upsilon>\<rbrakk> \<sigma>\<^sub>0)\<close>}
|
||||||
|
|
|
@ -159,9 +159,6 @@ replaced by built-in document templates.\<close> for users are:
|
||||||
\<^LaTeX>-styles) need to be registered in this file. For details of Isabelle's build system, please
|
\<^LaTeX>-styles) need to be registered in this file. For details of Isabelle's build system, please
|
||||||
consult the Isabelle System Manual~@{cite "wenzel:system-manual:2020"}. In addition to the
|
consult the Isabelle System Manual~@{cite "wenzel:system-manual:2020"}. In addition to the
|
||||||
standard features, this file also contains \<^isadof>-specific configurations:
|
standard features, this file also contains \<^isadof>-specific configurations:
|
||||||
\<^item> \<^boxed_bash>\<open>dof_ontologies\<close> a list of (fully qualified) ontologies, separated by spaces, used
|
|
||||||
by the project.
|
|
||||||
\<^item> \<^boxed_bash>\<open>dof_template\<close> the (fully qualified) document template.
|
|
||||||
\<^item> \<^boxed_bash>\<open>document_build=dof\<close> needs to be present to tell Isabelle to use the
|
\<^item> \<^boxed_bash>\<open>document_build=dof\<close> needs to be present to tell Isabelle to use the
|
||||||
Isabelle/DOF backend for the document generation.
|
Isabelle/DOF backend for the document generation.
|
||||||
\<^item> The file \<^boxed_bash>\<open>preamble.tex\<close>\<^index>\<open>preamble.tex\<close>, which allows users to add additional
|
\<^item> The file \<^boxed_bash>\<open>preamble.tex\<close>\<^index>\<open>preamble.tex\<close>, which allows users to add additional
|
||||||
|
@ -186,12 +183,14 @@ Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]
|
||||||
|
|
||||||
Options are:
|
Options are:
|
||||||
-I init Mercurial repository and add generated files
|
-I init Mercurial repository and add generated files
|
||||||
|
-h print help
|
||||||
-n NAME alternative session name (default: directory base name)
|
-n NAME alternative session name (default: directory base name)
|
||||||
-o ONTOLOGY ontology (default: scholarly_paper)
|
-o NAMES list of ontologies, separated by blanks
|
||||||
-t TEMPLATE tempalte (default: scrartcl)
|
(default: "technical_report scholarly_paper")
|
||||||
|
-q quiet mode: less verbosity
|
||||||
Prepare session root directory (default: current directory).\<close>}
|
-t NAME template (default: "scrreprt-modern")
|
||||||
|
|
||||||
|
Create session root directory for Isabelle/DOF (default: current directory).\<close>}
|
||||||
\<close>
|
\<close>
|
||||||
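A typical invocation, using the defaults listed above (the directory and session names \<^boxed_bash>\<open>my_project\<close> and \<^boxed_bash>\<open>my_session\<close> are only illustrative), might then be:

@{boxed_bash [display]\<open>isabelle dof_mkroot -n my_session my_project\<close>}

This should create the session root directory with a pre-configured \<^boxed_bash>\<open>ROOT\<close> file using the default ontologies and template.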
|
|
||||||
section*[writing_doc::technical]\<open>Writing Documents: General Remarks\<close>
|
section*[writing_doc::technical]\<open>Writing Documents: General Remarks\<close>
|
||||||
|
|
|
@ -337,8 +337,8 @@ is currently only available in the SML API's of the kernel.
|
||||||
( @@{command "text*"} '[' meta_args ']' '\<open>' formal_text '\<close>'
|
( @@{command "text*"} '[' meta_args ']' '\<open>' formal_text '\<close>'
|
||||||
| @@{command "ML*"} ('[' meta_args ']')? '\<open>' SML_code '\<close>'
|
| @@{command "ML*"} ('[' meta_args ']')? '\<open>' SML_code '\<close>'
|
||||||
| @@{command "term*"} ('[' meta_args ']')? '\<open>' HOL_term '\<close>'
|
| @@{command "term*"} ('[' meta_args ']')? '\<open>' HOL_term '\<close>'
|
||||||
| @@{command "value*"} \<newline> ('[' meta_args ']' ('[' evaluator ']')?)? '\<open>' HOL_term '\<close>'
|
| (@@{command "value*"}
|
||||||
| @@{command "assert*"} \<newline> ('[' meta_args ']' ('[' evaluator ']')?)? '\<open>' HOL_term '\<close>'
|
| @@{command "assert*"}) \<newline> ('[' meta_args ']' ('[' evaluator ']')?)? '\<open>' HOL_term '\<close>'
|
||||||
)
|
)
|
||||||
\<close>
|
\<close>
|
||||||
\<^rail>\<open>
|
\<^rail>\<open>
|
||||||
|
@ -1041,10 +1041,11 @@ text\<open>
|
||||||
Ontological classes as described so far are too liberal in many situations.
|
Ontological classes as described so far are too liberal in many situations.
|
||||||
There is a first high-level syntax implementation for class invariants.
|
There is a first high-level syntax implementation for class invariants.
|
||||||
These invariants can be checked when an instance of the class is defined.
|
These invariants can be checked when an instance of the class is defined.
|
||||||
To enable the checking of the invariants, the \<^boxed_theory_text>\<open>invariants_checking\<close>
|
To enable the strict checking of the invariants,
|
||||||
|
the \<^boxed_theory_text>\<open>invariants_strict_checking\<close>
|
||||||
theory attribute must be set:
|
theory attribute must be set:
|
||||||
@{boxed_theory_text [display]\<open>
|
@{boxed_theory_text [display]\<open>
|
||||||
declare[[invariants_checking = true]]\<close>}
|
declare[[invariants_strict_checking = true]]\<close>}
|
||||||
|
|
||||||
For example, let's define the following two classes:
|
For example, let's define the following two classes:
|
||||||
@{boxed_theory_text [display]\<open>
|
@{boxed_theory_text [display]\<open>
|
||||||
|
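Independently of the concrete classes discussed here, the invariant syntax itself can be sketched as follows (class, attribute, and invariant names are purely illustrative):

@{boxed_theory_text [display]\<open>
doc_class item =
  cost :: int
  invariant positive_cost :: "cost \<sigma> > 0"
\<close>}

An instance violating \<^boxed_theory_text>\<open>positive_cost\<close> is then rejected at definition time when strict checking is enabled.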
@ -1104,6 +1105,12 @@ text\<open>
|
||||||
All specified constraints are already checked in the IDE of \<^dof> while editing.
|
All specified constraints are already checked in the IDE of \<^dof> while editing.
|
||||||
The invariant \<^boxed_theory_text>\<open>author_finite\<close> enforces that the user sets the
|
The invariant \<^boxed_theory_text>\<open>author_finite\<close> enforces that the user sets the
|
||||||
\<^boxed_theory_text>\<open>authored_by\<close> set.
|
\<^boxed_theory_text>\<open>authored_by\<close> set.
|
||||||
|
The invariants \<^theory_text>\<open>author_finite\<close> and \<^theory_text>\<open>establish_defined\<close> cannot be checked directly
|
||||||
|
and need a little help.
|
||||||
|
We can set the \<open>invariants_checking_with_tactics\<close> theory attribute to help the checking.
|
||||||
|
It will enable a basic tactic, using unfold and auto:
|
||||||
|
@{boxed_theory_text [display]\<open>
|
||||||
|
declare[[invariants_checking_with_tactics = true]]\<close>}
|
||||||
There are still some limitations with this high-level syntax.
|
There are still some limitations with this high-level syntax.
|
||||||
For now, the high-level syntax does not support monitors (see \<^technical>\<open>sec:monitors\<close>).
|
For now, the high-level syntax does not support monitors (see \<^technical>\<open>sec:monitors\<close>).
|
||||||
For example, one would like to delay a final error message till the
|
For example, one would like to delay a final error message till the
|
||||||
|
@ -1182,9 +1189,9 @@ text\<open>
|
||||||
fun check_result_inv oid {is_monitor:bool} ctxt =
|
fun check_result_inv oid {is_monitor:bool} ctxt =
|
||||||
let
|
let
|
||||||
val kind =
|
val kind =
|
||||||
AttributeAccess.compute_attr_access ctxt "evidence" oid <@>{here} <@>{here}
|
ISA_core.compute_attr_access ctxt "evidence" oid NONE <@>{here}
|
||||||
val prop =
|
val prop =
|
||||||
AttributeAccess.compute_attr_access ctxt "property" oid <@>{here} <@>{here}
|
ISA_core.compute_attr_access ctxt "property" oid NONE <@>{here}
|
||||||
val tS = HOLogic.dest_list prop
|
val tS = HOLogic.dest_list prop
|
||||||
in case kind of
|
in case kind of
|
||||||
<@>{term "proof"} => if not(null tS) then true
|
<@>{term "proof"} => if not(null tS) then true
|
||||||
|
@ -1233,7 +1240,8 @@ value*\<open>filter (\<lambda>\<sigma>. the (text_section.level \<sigma>) > 1)
|
||||||
|
|
||||||
section*[infrastructure::technical]\<open>Technical Infrastructure\<close>
|
section*[infrastructure::technical]\<open>Technical Infrastructure\<close>
|
||||||
|
|
||||||
text\<open>
|
text\<open>
|
||||||
|
%FIXME update story concerning "list"
|
||||||
The list of fully supported (\<^ie>, supporting both interactive ontological modeling and
|
The list of fully supported (\<^ie>, supporting both interactive ontological modeling and
|
||||||
document generation) ontologies and the list of supported document templates can be
|
document generation) ontologies and the list of supported document templates can be
|
||||||
obtained by calling \<^boxed_bash>\<open>isabelle dof_mkroot -h\<close> (see \<^technical>\<open>first_project\<close>).
|
obtained by calling \<^boxed_bash>\<open>isabelle dof_mkroot -h\<close> (see \<^technical>\<open>first_project\<close>).
|
||||||
|
@ -1320,6 +1328,7 @@ text\<open>
|
||||||
text\<open>
|
text\<open>
|
||||||
Developing a new document template ``\<^boxed_bash>\<open>bar\<close>'' requires the following steps:
|
Developing a new document template ``\<^boxed_bash>\<open>bar\<close>'' requires the following steps:
|
||||||
\<^item> develop a new \<^LaTeX>-template \<^boxed_bash>\<open>src/document-templates/root-bar.tex\<close>
|
\<^item> develop a new \<^LaTeX>-template \<^boxed_bash>\<open>src/document-templates/root-bar.tex\<close>
|
||||||
|
\<^item> add a suitable \<^theory_text>\<open>define_template\<close> command to theory \<^theory>\<open>Isabelle_DOF.Isa_DOF\<close>.
|
||||||
\<^item> activate the new document template by executing the \<^boxed_bash>\<open>install\<close> script. You can skip the lengthy
|
\<^item> activate the new document template by executing the \<^boxed_bash>\<open>install\<close> script. You can skip the lengthy
|
||||||
checks for the AFP entries and the installation of the Isabelle patch by using the
|
checks for the AFP entries and the installation of the Isabelle patch by using the
|
||||||
\<^boxed_bash>\<open>--skip-afp\<close> option:
|
\<^boxed_bash>\<open>--skip-afp\<close> option:
|
||||||
|
|
|
@ -53,7 +53,6 @@ text\<open>
|
||||||
\<open>structure Data = Generic_Data
|
\<open>structure Data = Generic_Data
|
||||||
( type T = docobj_tab * docclass_tab * ...
|
( type T = docobj_tab * docclass_tab * ...
|
||||||
val empty = (initial_docobj_tab, initial_docclass_tab, ...)
|
val empty = (initial_docobj_tab, initial_docclass_tab, ...)
|
||||||
val extend = I
|
|
||||||
fun merge((d1,c1,...),(d2,c2,...)) = (merge_docobj_tab (d1,d2,...),
|
fun merge((d1,c1,...),(d2,c2,...)) = (merge_docobj_tab (d1,d2,...),
|
||||||
merge_docclass_tab(c1,c2,...))
|
merge_docclass_tab(c1,c2,...))
|
||||||
);\<close>}
|
);\<close>}
|
||||||
|
|
|
@ -15,6 +15,8 @@
|
||||||
theory "Isabelle_DOF-Manual"
|
theory "Isabelle_DOF-Manual"
|
||||||
imports "05_Implementation"
|
imports "05_Implementation"
|
||||||
begin
|
begin
|
||||||
|
use_template "scrreprt-modern"
|
||||||
|
use_ontology "technical_report" and "CENELEC_50128"
|
||||||
close_monitor*[this]
|
close_monitor*[this]
|
||||||
check_doc_global
|
check_doc_global
|
||||||
text\<open>Resulting trace in \<^verbatim>\<open>doc_item\<close> ''this'': \<close>
|
text\<open>Resulting trace in \<^verbatim>\<open>doc_item\<close> ''this'': \<close>
|
||||||
|
|
|
@ -1,7 +1,5 @@
|
||||||
session "Isabelle_DOF-Manual" = "Isabelle_DOF" +
|
session "Isabelle_DOF-Manual" = "Isabelle_DOF" +
|
||||||
options [document = pdf, document_output = "output", document_build = dof,
|
options [document = pdf, document_output = "output", document_build = dof, quick_and_dirty = true]
|
||||||
dof_ontologies = "Isabelle_DOF.technical_report Isabelle_DOF.cenelec_50128", dof_template = "Isabelle_DOF.scrreprt-modern",
|
|
||||||
quick_and_dirty = true]
|
|
||||||
theories
|
theories
|
||||||
"Isabelle_DOF-Manual"
|
"Isabelle_DOF-Manual"
|
||||||
document_files
|
document_files
|
||||||
|
|
|
@ -451,7 +451,7 @@
|
||||||
journal = {Archive of Formal Proofs},
|
journal = {Archive of Formal Proofs},
|
||||||
month = may,
|
month = may,
|
||||||
year = 2010,
|
year = 2010,
|
||||||
note = {\url{http://isa-afp.org/entries/Regular-Sets.html}, Formal
|
note = {\url{https://isa-afp.org/entries/Regular-Sets.html}, Formal
|
||||||
proof development},
|
proof development},
|
||||||
issn = {2150-914x}
|
issn = {2150-914x}
|
||||||
}
|
}
|
||||||
|
@ -462,7 +462,7 @@
|
||||||
journal = {Archive of Formal Proofs},
|
journal = {Archive of Formal Proofs},
|
||||||
month = mar,
|
month = mar,
|
||||||
year = 2004,
|
year = 2004,
|
||||||
note = {\url{http://isa-afp.org/entries/Functional-Automata.html},
|
note = {\url{https://isa-afp.org/entries/Functional-Automata.html},
|
||||||
Formal proof development},
|
Formal proof development},
|
||||||
issn = {2150-914x}
|
issn = {2150-914x}
|
||||||
}
|
}
|
||||||
|
|
|
@ -1,7 +1,5 @@
|
||||||
session "TR_MyCommentedIsabelle" = "Isabelle_DOF" +
|
session "TR_MyCommentedIsabelle" = "Isabelle_DOF" +
|
||||||
options [document = pdf, document_output = "output", document_build = dof,
|
options [document = pdf, document_output = "output", document_build = dof, quick_and_dirty = true]
|
||||||
dof_ontologies = "Isabelle_DOF.technical_report", dof_template = Isabelle_DOF.scrreprt,
|
|
||||||
quick_and_dirty = true]
|
|
||||||
theories
|
theories
|
||||||
"TR_MyCommentedIsabelle"
|
"TR_MyCommentedIsabelle"
|
||||||
document_files
|
document_files
|
||||||
|
|
|
@ -14,9 +14,11 @@
|
||||||
(*<*)
|
(*<*)
|
||||||
theory TR_MyCommentedIsabelle
|
theory TR_MyCommentedIsabelle
|
||||||
imports "Isabelle_DOF.technical_report"
|
imports "Isabelle_DOF.technical_report"
|
||||||
|
|
||||||
begin
|
begin
|
||||||
|
|
||||||
|
use_template "scrreprt"
|
||||||
|
use_ontology "technical_report"
|
||||||
|
|
||||||
define_shortcut* isabelle \<rightleftharpoons> \<open>Isabelle/HOL\<close>
|
define_shortcut* isabelle \<rightleftharpoons> \<open>Isabelle/HOL\<close>
|
||||||
|
|
||||||
open_monitor*[this::report]
|
open_monitor*[this::report]
|
||||||
|
@ -62,7 +64,7 @@ text\<open> \<^vs>\<open>-0.5cm\<close>
|
||||||
Isabelle and Isabelle/HOL, a complementary text to the unfortunately somewhat outdated
|
Isabelle and Isabelle/HOL, a complementary text to the unfortunately somewhat outdated
|
||||||
"The Isabelle Cookbook" in \<^url>\<open>https://nms.kcl.ac.uk/christian.urban/Cookbook/\<close>.
|
"The Isabelle Cookbook" in \<^url>\<open>https://nms.kcl.ac.uk/christian.urban/Cookbook/\<close>.
|
||||||
The present text is also complementary to the current version of
|
The present text is also complementary to the current version of
|
||||||
\<^url>\<open>https://isabelle.in.tum.de/dist/Isabelle2021/doc/isar-ref.pdf\<close>
|
\<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>
|
||||||
"The Isabelle/Isar Implementation" by Makarius Wenzel in that it focuses on subjects
|
"The Isabelle/Isar Implementation" by Makarius Wenzel in that it focuses on subjects
|
||||||
not covered there, or presents alternative explanations for which I believe, based on my
|
not covered there, or presents alternative explanations for which I believe, based on my
|
||||||
experience with students and PhD candidates, that they are helpful.
|
experience with students and PhD candidates, that they are helpful.
|
||||||
|
@ -364,7 +366,7 @@ subsection*[t213::example]\<open>Mechanism 2 : Extending the Global Context \<op
|
||||||
|
|
||||||
text\<open>A central mechanism for constructing user-defined data is by the \<^ML_functor>\<open>Generic_Data\<close>-functor.
|
text\<open>A central mechanism for constructing user-defined data is by the \<^ML_functor>\<open>Generic_Data\<close>-functor.
|
||||||
A plugin needing some data \<^verbatim>\<open>T\<close> and providing it with implementations for an
|
A plugin needing some data \<^verbatim>\<open>T\<close> and providing it with implementations for an
|
||||||
\<^verbatim>\<open>empty\<close>, and operations \<^verbatim>\<open>merge\<close> and \<^verbatim>\<open>extend\<close>, can construct a lens with operations
|
\<^verbatim>\<open>empty\<close>, and an operation \<^verbatim>\<open>merge\<close>, can construct a lens with operations
|
||||||
\<^verbatim>\<open>get\<close> and \<^verbatim>\<open>put\<close> that attach this data into the generic system context. Rather than using
|
\<^verbatim>\<open>get\<close> and \<^verbatim>\<open>put\<close> that attach this data into the generic system context. Rather than using
|
||||||
unsynchronized SML mutable variables, this is the mechanism to introduce component local
|
unsynchronized SML mutable variables, this is the mechanism to introduce component local
|
||||||
data in Isabelle, which allows managing this data for the necessary backtrack and synchronization
|
data in Isabelle, which allows managing this data for the necessary backtrack and synchronization
|
||||||
|
@ -373,14 +375,12 @@ text\<open>A central mechanism for constructing user-defined data is by the \<^M
|
||||||
ML \<open>
|
ML \<open>
|
||||||
datatype X = mt
|
datatype X = mt
|
||||||
val init = mt;
|
val init = mt;
|
||||||
val ext = I
|
|
||||||
fun merge (X,Y) = mt
|
fun merge (X,Y) = mt
|
||||||
|
|
||||||
structure Data = Generic_Data
|
structure Data = Generic_Data
|
||||||
(
|
(
|
||||||
type T = X
|
type T = X
|
||||||
val empty = init
|
val empty = init
|
||||||
val extend = ext
|
|
||||||
val merge = merge
|
val merge = merge
|
||||||
);
|
);
|
||||||
\<close>
|
\<close>
|
||||||
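The functor instantiation yields the access operations of the lens; a minimal usage sketch (continuing the toy \<^verbatim>\<open>Data\<close> structure above, evaluated in some ambient context):

ML \<open>
  (* read the component data from a generic context *)
  val x = Data.get (Context.Proof @{context});
  (* write a value (here: the only value mt) back into the context *)
  val ctxt' = Data.put mt (Context.Proof @{context});
\<close>

Note that a combinator \<^verbatim>\<open>Data.map\<close> is also generated, combining \<^verbatim>\<open>get\<close> and \<^verbatim>\<open>put\<close>.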
|
|
|
@ -151,19 +151,19 @@ fi
|
||||||
|
|
||||||
ACTUAL_ISABELLE_VERSION=`$ISABELLE_TOOL version`
|
ACTUAL_ISABELLE_VERSION=`$ISABELLE_TOOL version`
|
||||||
|
|
||||||
ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL options -g dof_isabelle)"
|
ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL dof_param -b isabelle_version)"
|
||||||
|
|
||||||
if [ ${ISABELLE_VERSION} = "Isabelle" ];
|
if [ ${ISABELLE_VERSION} = "Isabelle" ];
|
||||||
then
|
then
|
||||||
echo "Error: cannot find Isabelle/DOF configuration, please check that you"
|
echo "Error: cannot find Isabelle/DOF configuration, please check that you"
|
||||||
echo " registered Isabelle/DOF as an Isabelle component, e.g., using"
|
echo " registered Isabelle/DOF as an Isabelle component, e.g., using"
|
||||||
echo " isabelle components -u $PWD"
|
echo " isabelle components -u ."
|
||||||
exit 1
|
exit 1
|
||||||
fi
|
fi
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
AFP_DATE="$($ISABELLE_TOOL options -g dof_afp)"
|
AFP_DATE="$($ISABELLE_TOOL dof_param -b afp_version)"
|
||||||
AFP_URL="https://www.isa-afp.org/release/"$AFP_DATE".tar.gz"
|
AFP_URL="https://www.isa-afp.org/release/"$AFP_DATE".tar.gz"
|
||||||
|
|
||||||
echo ""
|
echo ""
|
||||||
|
|
|
@ -125,14 +125,14 @@ fun heading_command (name, pos) descr level =
|
||||||
Monitor_Command_Parser.document_command (name, pos) descr
|
Monitor_Command_Parser.document_command (name, pos) descr
|
||||||
{markdown = false, body = true} (enriched_text_element_cmd level);
|
{markdown = false, body = true} (enriched_text_element_cmd level);
|
||||||
|
|
||||||
val _ = heading_command ("title*", @{here}) "section heading" NONE;
|
val _ = heading_command \<^command_keyword>\<open>title*\<close> "section heading" NONE;
|
||||||
val _ = heading_command ("subtitle*", @{here}) "section heading" NONE;
|
val _ = heading_command \<^command_keyword>\<open>subtitle*\<close> "section heading" NONE;
|
||||||
val _ = heading_command ("chapter*", @{here}) "section heading" (SOME (SOME 0));
|
val _ = heading_command \<^command_keyword>\<open>chapter*\<close> "section heading" (SOME (SOME 0));
|
||||||
val _ = heading_command ("section*", @{here}) "section heading" (SOME (SOME 1));
|
val _ = heading_command \<^command_keyword>\<open>section*\<close> "section heading" (SOME (SOME 1));
|
||||||
val _ = heading_command ("subsection*", @{here}) "subsection heading" (SOME (SOME 2));
|
val _ = heading_command \<^command_keyword>\<open>subsection*\<close> "subsection heading" (SOME (SOME 2));
|
||||||
val _ = heading_command ("subsubsection*", @{here}) "subsubsection heading" (SOME (SOME 3));
|
val _ = heading_command \<^command_keyword>\<open>subsubsection*\<close> "subsubsection heading" (SOME (SOME 3));
|
||||||
val _ = heading_command ("paragraph*", @{here}) "paragraph heading" (SOME (SOME 4));
|
val _ = heading_command \<^command_keyword>\<open>paragraph*\<close> "paragraph heading" (SOME (SOME 4));
|
||||||
val _ = heading_command ("subparagraph*", @{here}) "subparagraph heading" (SOME (SOME 5));
|
val _ = heading_command \<^command_keyword>\<open>subparagraph*\<close> "subparagraph heading" (SOME (SOME 5));
|
||||||
|
|
||||||
end
|
end
|
||||||
end
|
end
|
||||||
|
@ -157,6 +157,9 @@ doc_class figure =
|
||||||
placement :: placement
|
placement :: placement
|
||||||
spawn_columns :: bool <= True
|
spawn_columns :: bool <= True
|
||||||
|
|
||||||
|
doc_class figure2 = figure +
|
||||||
|
caption :: string
|
||||||
|
|
||||||
|
|
||||||
doc_class side_by_side_figure = figure +
|
doc_class side_by_side_figure = figure +
|
||||||
anchor :: "string"
|
anchor :: "string"
|
||||||
|
@ -235,8 +238,8 @@ ML\<open>
|
||||||
(* Ontological Macro Command Support *)
|
(* Ontological Macro Command Support *)
|
||||||
(* *********************************************************************** *)
|
(* *********************************************************************** *)
|
||||||
|
|
||||||
val _ = Onto_Macros.heading_command ("figure*", @{here}) "figure" NONE;
|
val _ = Onto_Macros.heading_command \<^command_keyword>\<open>figure*\<close> "figure" NONE;
|
||||||
val _ = Onto_Macros.heading_command ("side_by_side_figure*", @{here}) "multiple figures" NONE;
|
val _ = Onto_Macros.heading_command \<^command_keyword>\<open>side_by_side_figure*\<close> "multiple figures" NONE;
|
||||||
\<close>
|
\<close>
|
||||||
|
|
||||||
(*<*)
|
(*<*)
|
||||||
|
@ -251,6 +254,99 @@ setup\<open>\<close>
|
||||||
*)
|
*)
|
||||||
(*>*)
|
(*>*)
|
||||||
|
|
||||||
|
subsubsection\<open>Figure Content\<close>
|
||||||
|
text\<open>The intermediate development goal is to separate the ontological, top-level construct
|
||||||
|
\<open>figure*\<close>, which will remain a referentiable, ontological document unit, from the more versatile
|
||||||
|
\<^emph>\<open>import\<close> of a figure. The hope is that this opens the way for more orthogonality and
|
||||||
|
abstraction from the LaTeX engine.
|
||||||
|
\<close>
|
||||||
|
|
||||||
|
ML\<open>
|
||||||
|
|
||||||
|
type fig_content = {relative_width : int, (* percent of textwidth, default 100 *)
|
||||||
|
scale : int, (* percent, default 100 *)
|
||||||
|
caption : Input.source (* default empty *)}
|
||||||
|
|
||||||
|
val mt_fig_content = {relative_width = 100,
|
||||||
|
scale = 100,
|
||||||
|
caption = Input.empty }: fig_content
|
||||||
|
|
||||||
|
(* as dumb as it gets: boilerplate record-update functions. *)
|
||||||
|
fun upd_relative_width key {relative_width,scale,caption } : fig_content =
|
||||||
|
{relative_width = key,scale = scale,caption = caption}: fig_content
|
||||||
|
|
||||||
|
fun upd_scale key {relative_width,scale,caption } : fig_content =
|
||||||
|
{relative_width = relative_width,scale = key,caption = caption}: fig_content
|
||||||
|
|
||||||
|
fun upd_caption key {relative_width,scale,caption} : fig_content =
|
||||||
|
{relative_width = relative_width,scale = scale,caption= key}: fig_content
|
||||||
|
|
||||||
|
|
||||||
|
val widthN = "width"
|
||||||
|
val scaleN = "scale"
|
||||||
|
val captionN = "caption";
|
||||||
|
|
||||||
|
fun fig_content_modes (ctxt, toks) =
|
||||||
|
let val (y, toks') = ((((Scan.optional
|
||||||
|
(Args.parens
|
||||||
|
(Parse.list1
|
||||||
|
( (Args.$$$ widthN |-- Args.$$$ "=" -- Parse.int
|
||||||
|
>> (fn (_, k) => upd_relative_width k))
|
||||||
|
|| (Args.$$$ scaleN |-- Args.$$$ "=" -- Parse.int
|
||||||
|
>> (fn (_, k) => upd_scale k))
|
||||||
|
|| (Args.$$$ captionN |-- Args.$$$ "=" -- Parse.document_source
|
||||||
|
>> (fn (_, k) => upd_caption k))
|
||||||
|
))) [K mt_fig_content])
|
||||||
|
: (fig_content -> fig_content) list parser)
|
||||||
|
>> (foldl1 (op #>)))
|
||||||
|
: (fig_content -> fig_content) parser)
|
||||||
|
(toks)
|
||||||
|
in (y, (ctxt, toks')) end
|
||||||
|
|
||||||
|
fun document_antiq (check: Proof.context -> Path.T option -> Input.source -> Path.T) =
|
||||||
|
Args.context -- Scan.lift Parse.path_input >> (fn (ctxt, source) =>
|
||||||
|
(check ctxt NONE source;
|
||||||
|
Latex.string (Latex.output_ascii_breakable "/" (Input.string_of source))
|
||||||
|
|> Latex.macro "isatt"));
|
||||||
|
|
||||||
|
|
||||||
|
fun fig_content_antiquotation name scan =
|
||||||
|
(Document_Output.antiquotation_raw_embedded name
|
||||||
|
(scan : ((fig_content -> fig_content) * Input.source) context_parser)
|
||||||
|
(fn ctxt =>
|
||||||
|
(fn (cfg_trans,file:Input.source) =>
|
||||||
|
let val {relative_width,scale,caption} = cfg_trans mt_fig_content
|
||||||
|
val _ = if relative_width < 0 orelse scale<0 then error("negative parameter.")
|
||||||
|
else ()
|
||||||
|
val wdth_s = if relative_width = 100 then ""
|
||||||
|
else "width="^Real.toString((Real.fromInt relative_width)
|
||||||
|
/ (Real.fromInt 100))^"\\textwidth" (* escape the backslash in the SML string *)
|
||||||
|
val scale_s= if scale = 100 then ""
|
||||||
|
else "scale="^Real.toString((Real.fromInt scale) / (Real.fromInt 100))
|
||||||
|
val arg = enclose "[" "]" (commas [wdth_s,scale_s])
|
||||||
|
val lab = Document_Output.output_document ctxt {markdown = false} caption
|
||||||
|
val path = Resources.check_file ctxt NONE file
|
||||||
|
val _ = writeln("file "^Path.file_name path)
|
||||||
|
(* ToDo: must be declared source of type png or jpeg or pdf, ... *)
|
||||||
|
in file
|
||||||
|
|> (Latex.string o Input.string_of)
|
||||||
|
|> (XML.enclose ("\\includegraphics"^arg^"{") "}")
|
||||||
|
|> (fn X => X @ Latex.macro "capture" lab)
|
||||||
|
end
|
||||||
|
)
|
||||||
|
));
|
||||||
|
|
||||||
|
val _ = fig_content_antiquotation
|
||||||
|
: binding
|
||||||
|
-> ((fig_content -> fig_content) * Input.source) context_parser
|
||||||
|
-> theory -> theory
|
||||||
|
|
||||||
|
val _ = Theory.setup
|
||||||
|
( fig_content_antiquotation \<^binding>\<open>fig_content\<close>
|
||||||
|
(fig_content_modes -- Scan.lift(Parse.path_input)))
|
||||||
|
|
||||||
|
\<close>
|
||||||
|
|
||||||
subsection\<open>Tables\<close>
|
subsection\<open>Tables\<close>
|
||||||
(* TODO ! ! ! *)
|
(* TODO ! ! ! *)
|
||||||
(* dito the future monitor: table - block *)
|
(* dito the future monitor: table - block *)
|
||||||
|
@ -260,7 +356,7 @@ text\<open>Tables are (sub) document-elements represented inside the documentati
|
||||||
language. The used technology is similar to the existing railroad-diagram support
|
language. The used technology is similar to the existing railroad-diagram support
|
||||||
(cf. \<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>, Sec. 4.5).
|
(cf. \<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>, Sec. 4.5).
|
||||||
|
|
||||||
However, tables are not directly based on the ideosynchrasies of Knuth-based language design ---
|
However, tables are not directly based on the idiosyncrasies of Knuth-based language design ---
|
||||||
|
|
||||||
Rather, tables come with a more abstract structure model than conventional typesetting in the
|
Rather, tables come with a more abstract structure model than conventional typesetting in the
|
||||||
LaTeX tradition. It is based on the following principles:
|
LaTeX tradition. It is based on the following principles:
|
||||||
|
@ -356,7 +452,7 @@ fun upd_cell_line_width num
|
||||||
cell_line_color = cell_line_color, cell_line_width = cell_line_width@[num] }
|
cell_line_color = cell_line_color, cell_line_width = cell_line_width@[num] }
|
||||||
: cell_config
|
: cell_config
|
||||||
|
|
||||||
|
(*global default configs *)
|
||||||
val (tab_cell_placing, tab_cell_placing_setup)
|
val (tab_cell_placing, tab_cell_placing_setup)
|
||||||
= Attrib.config_string \<^binding>\<open>tab_cell_placing\<close> (K "center");
|
= Attrib.config_string \<^binding>\<open>tab_cell_placing\<close> (K "center");
|
||||||
val (tab_cell_height, tab_cell_height_setup)
|
val (tab_cell_height, tab_cell_height_setup)
|
||||||
|
@ -387,6 +483,8 @@ val _ = Theory.setup( tab_cell_placing_setup
|
||||||
#> tab_cell_line_width_setup
|
#> tab_cell_line_width_setup
|
||||||
)
|
)
|
||||||
|
|
||||||
|
|
||||||
|
(*syntax for local tab specifier *)
|
||||||
val cell_placingN = "cell_placing"
|
val cell_placingN = "cell_placing"
|
||||||
val cell_heightN = "cell_height"
|
val cell_heightN = "cell_height"
|
||||||
val cell_widthN = "cell_width"
|
val cell_widthN = "cell_width"
|
||||||
|
@ -396,7 +494,8 @@ val cell_line_widthN = "cell_line_width"
|
||||||
|
|
||||||
val placing_scan = Args.$$$ "left" || Args.$$$ "center" || Args.$$$ "right"
|
val placing_scan = Args.$$$ "left" || Args.$$$ "center" || Args.$$$ "right"
|
||||||
|
|
||||||
val color_scan = Args.$$$ "none" || Args.$$$ "red" || Args.$$$ "green" || Args.$$$ "blue" || Args.$$$ "black"
|
val color_scan = Args.$$$ "none" || Args.$$$ "red" || Args.$$$ "green"
|
||||||
|
|| Args.$$$ "blue" || Args.$$$ "black"
|
||||||
|
|
||||||
(*
|
(*
|
||||||
|
|
||||||
|
@@ -516,7 +615,7 @@ val _ = Theory.setup
 end
 \<close>

+text\<open> @{file "../ROOT"} \<close>
 define_shortcut* clearpage \<rightleftharpoons> \<open>\clearpage{}\<close>
 hf \<rightleftharpoons> \<open>\hfill\<close>
 br \<rightleftharpoons> \<open>\break\<close>

@@ -526,6 +625,9 @@ declare[[tab_cell_placing="left",tab_cell_height="18.0cm"]]

 section\<open>Tests\<close>
 (*<*)

+text\<open> @{fig_content [display] (scale = 80, width=80, caption=\<open>this is \<^term>\<open>\<sigma>\<^sub>i+2\<close> \<dots>\<close>)
+       \<open>../../examples/scholarly_paper/2018-cicm-isabelle_dof-applications/document/figures/isabelle-architecture.pdf\<close>}\<close>
 text\<open> @{table_inline [display] (cell_placing = center,cell_height =\<open>12.0cm\<close>,
 cell_height =\<open>13pt\<close>, cell_width = \<open>12.0cm\<close>,
 cell_bgnd_color=black,cell_line_color=red,cell_line_width=\<open>12.0cm\<close>)

@@ -43,6 +43,8 @@ theory Isa_DOF (* Isabelle Document Ontology Framework *)
 and "text*" "text-macro*" :: document_body
 and "term*" "value*" "assert*" :: document_body

+and "use_template" "use_ontology" :: thy_decl
+and "define_template" "define_ontology" :: thy_load
 and "print_doc_classes" "print_doc_items"
 "print_doc_class_template" "check_doc_global" :: diag

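The two command pairs added above split template/ontology handling into definition (`define_template` and `define_ontology`, declared as `thy_load` since they read files) and use (`use_template` and `use_ontology`). A minimal usage sketch — the template and ontology names here are illustrative, not taken from this diff:

```isabelle
(* hypothetical names: substitute any template/ontology previously
   registered via define_template / define_ontology *)
use_template "scrreprt-modern"
use_ontology "scholarly_paper"
```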
@@ -158,7 +160,7 @@ struct

 fun merge_docclass_tab (otab,otab') = Symtab.merge (op =) (otab,otab')

-val tag_attr = (Binding.make("tag_attribute",@{here}), @{typ "int"},Mixfix.NoSyn)
+val tag_attr = (\<^binding>\<open>tag_attribute\<close>, \<^Type>\<open>int\<close>, Mixfix.NoSyn)
 (* Attribute hidden to the user and used internally by isabelle_DOF.
 For example, this allows to add a specific id to a class
 to be able to reference the class internally.

@@ -251,7 +253,6 @@ structure Data = Generic_Data
 docclass_eager_inv_tab = initial_docclass_eager_inv_tab,
 docclass_lazy_inv_tab = initial_docclass_lazy_inv_tab
 }
-val extend = I
 fun merge( {docobj_tab=d1,docclass_tab = c1,
 ISA_transformer_tab = e1, monitor_tab=m1,
 docclass_inv_tab = n1,

@@ -363,7 +364,7 @@ fun is_subclass (ctxt) s t = is_subclass0(#docclass_tab(get_data ctxt)) s t
 fun is_subclass_global thy s t = is_subclass0(#docclass_tab(get_data_global thy)) s t


-fun typ_to_cid (Type(s,[@{typ "unit"}])) = Long_Name.qualifier s
+fun typ_to_cid (Type(s,[\<^Type>\<open>unit\<close>])) = Long_Name.qualifier s
 |typ_to_cid (Type(_,[T])) = typ_to_cid T
 |typ_to_cid _ = error("type is not an ontological type.")

@@ -812,8 +813,8 @@ fun print_doc_class_tree ctxt P T =
 val (strict_monitor_checking, strict_monitor_checking_setup)
 = Attrib.config_bool \<^binding>\<open>strict_monitor_checking\<close> (K false);

-val (invariants_checking, invariants_checking_setup)
-= Attrib.config_bool \<^binding>\<open>invariants_checking\<close> (K false);
+val (invariants_strict_checking, invariants_strict_checking_setup)
+= Attrib.config_bool \<^binding>\<open>invariants_strict_checking\<close> (K false);

 val (invariants_checking_with_tactics, invariants_checking_with_tactics_setup)
 = Attrib.config_bool \<^binding>\<open>invariants_checking_with_tactics\<close> (K false);

@@ -824,7 +825,7 @@ end (* struct *)
 \<close>

 setup\<open>DOF_core.strict_monitor_checking_setup
-#> DOF_core.invariants_checking_setup
+#> DOF_core.invariants_strict_checking_setup
 #> DOF_core.invariants_checking_with_tactics_setup\<close>

 section\<open> Syntax for Term Annotation Antiquotations (TA)\<close>
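After this `setup`, the renamed flag behaves like any boolean configuration option; a usage sketch (both options default to `false`, as declared above):

```isabelle
(* turn invariant violations into hard errors and allow the
   proof-tactic fallback during invariant checking *)
declare[[invariants_strict_checking = true,
         invariants_checking_with_tactics = true]]
```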
@@ -839,8 +840,8 @@ versions might extend this feature substantially.\<close>

 subsection\<open> Syntax \<close>

-typedecl "doc_class"
+datatype "doc_class" = mk string

 \<comment> \<open>and others in the future : file, http, thy, ...\<close>

 datatype "typ" = ISA_typ string ("@{typ _}")

@@ -851,6 +852,7 @@ datatype "file" = ISA_file string ("@{file _}")
 datatype "thy" = ISA_thy string ("@{thy _}")
 consts ISA_docitem :: "string \<Rightarrow> 'a" ("@{docitem _}")
 datatype "docitem_attr" = ISA_docitem_attr string string ("@{docitemattr (_) :: (_)}")
+consts ISA_trace_attribute :: "string \<Rightarrow> (string * string) list" ("@{trace-attribute _}")

 \<comment> \<open>Dynamic setup of inner syntax cartouche\<close>

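The new `ISA_trace_attribute` constant gives monitor traces a term-level antiquotation. A usage sketch — the monitor instance name `aMonitor` is hypothetical, and the exact inner-syntax form may differ:

```isabelle
(* the antiquotation elaborates to a (class name, instance name) list *)
value*\<open>map fst @{trace-attribute aMonitor}\<close>
```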
@@ -946,6 +948,7 @@ structure ISA_core =
 struct

 fun err msg pos = error (msg ^ Position.here pos);
+fun warn msg pos = warning (msg ^ Position.here pos);

 fun check_path check_file ctxt dir (name, pos) =
 let

@@ -1035,7 +1038,7 @@ fun ML_isa_check_docitem thy (term, req_ty, pos) _ =
 | SOME {pos=pos_decl,cid,id,...} =>
 let val ctxt = (Proof_Context.init_global thy)
 val req_class = case req_ty of
-Type("fun",[_,T]) => DOF_core.typ_to_cid T
+\<^Type>\<open>fun _ T\<close> => DOF_core.typ_to_cid T
 | _ => error("can not infer type for: "^ name)
 in if cid <> DOF_core.default_cid
 andalso not(DOF_core.is_subclass ctxt cid req_class)

@@ -1046,6 +1049,14 @@ fun ML_isa_check_docitem thy (term, req_ty, pos) _ =
 else err ("faulty reference to docitem: "^name) pos
 in ML_isa_check_generic check thy (term, pos) end

+fun ML_isa_check_trace_attribute thy (term, _, pos) s =
+  let
+    fun check thy (name, _) =
+      case DOF_core.get_object_global name thy of
+        NONE => err ("No class instance: " ^ name) pos
+      | SOME(_) => ()
+  in ML_isa_check_generic check thy (term, pos) end

 fun ML_isa_elaborate_generic (_:theory) isa_name ty term_option _ =
 case term_option of
 NONE => error("Wrong term option. You must use a defined term")

@@ -1141,12 +1152,62 @@ fun declare_class_instances_annotation thy doc_class_name =
 {check=check_identity, elaborate= elaborate_instances_list}) thy end)
 end

+fun symbex_attr_access0 ctxt proj_term term =
+  let
+    val [subterm'] = Type_Infer_Context.infer_types ctxt [proj_term $ term]
+  in Value_Command.value ctxt (subterm') end
+
+fun compute_attr_access ctxt attr oid pos_option pos' = (* template *)
+  case DOF_core.get_value_global oid (Context.theory_of ctxt) of
+    SOME term =>
+      let val ctxt = (Proof_Context.init_global (Context.theory_of ctxt))
+          val SOME{cid,pos=pos_decl,id,...} = DOF_core.get_object_local oid ctxt
+          val docitem_markup = docref_markup false oid id pos_decl;
+          val _ = Context_Position.report ctxt pos' docitem_markup;
+          val (* (long_cid, attr_b,ty) = *)
+              {long_name, typ=ty, def_pos, ...} =
+            case DOF_core.get_attribute_info_local cid attr ctxt of
+              SOME f => f
+            | NONE => error("attribute undefined for reference: "
+                            ^ oid
+                            ^ Position.here
+                              (the pos_option handle Option.Option =>
+                                error("Attribute "
+                                      ^ attr
+                                      ^ " undefined for reference: "
+                                      ^ oid ^ Position.here pos')))
+          val proj_term = Const(long_name,dummyT --> ty)
+          val _ = case pos_option of
+                    NONE => ()
+                  | SOME pos =>
+                      let
+                        val class_name = Long_Name.qualifier long_name
+                        val SOME{id,...} = DOF_core.get_doc_class_local class_name ctxt
+                        val class_markup = docclass_markup false class_name id def_pos
+                      in Context_Position.report ctxt pos class_markup end
+      in symbex_attr_access0 ctxt proj_term term end
+      (*in Value_Command.value ctxt term end*)
+  | NONE => error("identifier not a docitem reference" ^ Position.here pos')

+fun ML_isa_elaborate_trace_attribute (thy:theory) _ _ term_option pos =
+  case term_option of
+    NONE => err ("Malformed term annotation") pos
+  | SOME term =>
+      let
+        val oid = HOLogic.dest_string term
+        val traces = compute_attr_access (Context.Theory thy) "trace" oid NONE pos
+        fun conv (\<^Const>\<open>Pair \<^typ>\<open>doc_class rexp\<close> \<^typ>\<open>string\<close>\<close>
+                  $ (\<^Const>\<open>Atom \<^typ>\<open>doc_class\<close>\<close> $ (\<^Const>\<open>mk\<close> $ s)) $ S) =
+          let val s' = DOF_core.read_cid (Proof_Context.init_global thy) (HOLogic.dest_string s)
+          in \<^Const>\<open>Pair \<^typ>\<open>string\<close> \<^typ>\<open>string\<close>\<close> $ HOLogic.mk_string s' $ S end
+        val traces' = map conv (HOLogic.dest_list traces)
+      in HOLogic.mk_list \<^Type>\<open>prod \<^typ>\<open>string\<close> \<^typ>\<open>string\<close>\<close> traces' end

 (* utilities *)

-fun property_list_dest ctxt X = (map (fn Const ("Isa_DOF.ISA_term", _) $ s => HOLogic.dest_string s
-                                       |Const ("Isa_DOF.ISA_term_repr",_) $ s
-                                          => holstring_to_bstring ctxt (HOLogic.dest_string s))
-                                     (HOLogic.dest_list X))
+fun property_list_dest ctxt X =
+  map (fn \<^Const_>\<open>ISA_term for s\<close> => HOLogic.dest_string s
+        |\<^Const_>\<open>ISA_term_repr for s\<close> => holstring_to_bstring ctxt (HOLogic.dest_string s))
+      (HOLogic.dest_list X)

 end; (* struct *)

@@ -1167,7 +1228,8 @@ setup\<open>DOF_core.update_isa_global("Isa_DOF.file.file",
 {check=ISA_core.ML_isa_check_file, elaborate=ISA_core.ML_isa_elaborate_generic}) \<close>
 setup\<open>DOF_core.update_isa_global("Isa_DOF.docitem",
 {check=ISA_core.ML_isa_check_docitem, elaborate=ISA_core.ML_isa_elaborate_generic}) \<close>
+setup\<open>DOF_core.update_isa_global("Isa_DOF.trace_attribute",
+{check=ISA_core.ML_isa_check_trace_attribute, elaborate=ISA_core.ML_isa_elaborate_trace_attribute}) \<close>
 section\<open> Syntax for Annotated Documentation Commands (the '' View'' Part I) \<close>


@@ -1330,7 +1392,7 @@ structure Docitem_Parser =
 struct

 fun cid_2_cidType cid_long thy =
-if cid_long = DOF_core.default_cid then @{typ "unit"}
+if cid_long = DOF_core.default_cid then \<^Type>\<open>unit\<close>
 else let val t = #docclass_tab(DOF_core.get_data_global thy)
 fun ty_name cid = cid^"."^ Long_Name.base_name cid ^ Record.extN
 fun fathers cid_long = case Symtab.lookup t cid_long of

@@ -1341,7 +1403,7 @@ fun cid_2_cidType cid_long thy =
 | SOME ({inherits_from=NONE, ...}) => [cid_long]
 | SOME ({inherits_from=SOME(_,father), ...}) =>
 cid_long :: (fathers father)
-in fold (fn x => fn y => Type(ty_name x,[y])) (fathers cid_long) @{typ "unit"}
+in fold (fn x => fn y => Type(ty_name x,[y])) (fathers cid_long) \<^Type>\<open>unit\<close>
 end

 fun create_default_object thy class_name =

@@ -1351,7 +1413,7 @@ fun create_default_object thy class_name =
 fun attr_to_free (binding, typ, _) = Free (purified_class_name ^ "_"
 ^ (Binding.name_of binding)
 ^ "_Attribute_Not_Initialized", typ)
-val class_list' = DOF_core.get_attributes class_name thy
+val class_list = DOF_core.get_attributes class_name thy
 fun attrs_filter [] = []
 | attrs_filter (x::xs) =
 let val (cid, ys) = x

@@ -1362,13 +1424,15 @@ fun create_default_object thy class_name =
 then true
 else is_duplicated y xs end
 in (cid, filter_out (fn y => is_duplicated y xs) ys)::attrs_filter xs end
-val class_list'' = rev (attrs_filter (rev class_list'))
-fun add_tag_to_attrs_free tag_attr thy (cid, filtered_attr_list) =
+val class_list' = rev (attrs_filter (rev class_list))
+val tag_attr = HOLogic.mk_number \<^Type>\<open>int\<close>
+fun add_tag_to_attrs_free' tag_attr thy (cid, filtered_attr_list) =
 if DOF_core.is_virtual cid thy
-then (attr_to_free tag_attr)::(map (attr_to_free) filtered_attr_list)
+then (tag_attr (serial ()))::(map (attr_to_free) filtered_attr_list)
 else (map (attr_to_free) filtered_attr_list)
-val class_list''' = flat (map (add_tag_to_attrs_free DOF_core.tag_attr thy) class_list'')
-in list_comb (make_const, (attr_to_free DOF_core.tag_attr)::class_list''') end
+val class_list'' = flat (map (add_tag_to_attrs_free' tag_attr thy) class_list')
+in list_comb (make_const, (tag_attr (serial()))::class_list'') end


 fun check_classref {is_monitor=is_monitor} (SOME(cid,pos')) thy =
 let

@@ -1435,14 +1499,9 @@ fun calc_update_term {mk_elaboration=mk_elaboration} thy cid_long
 val tyenv' = (map (fn (s,(t,u)) => ((s,t),u)) (Vartab.dest tyenv))
 val _ = if Long_Name.base_name lhs = lhs orelse ln = lhs then ()
 else error("illegal notation for attribute of "^cid_long)
-fun join (ttt as @{typ "int"})
-  = Const (@{const_name "Groups.plus"}, ttt --> ttt --> ttt)
-  |join (ttt as @{typ "string"})
-  = Const (@{const_name "List.append"}, ttt --> ttt --> ttt)
-  |join (ttt as Type(@{type_name "set"},_))
-  = Const (@{const_name "Lattices.sup"}, ttt --> ttt --> ttt)
-  |join (ttt as Type(@{type_name "list"},_))
-  = Const (@{const_name "List.append"}, ttt --> ttt --> ttt)
+fun join (ttt as \<^Type>\<open>int\<close>) = \<^Const>\<open>Groups.plus ttt\<close>
+  |join (ttt as \<^Type>\<open>set _\<close>) = \<^Const>\<open>Lattices.sup ttt\<close>
+  |join \<^Type>\<open>list A\<close> = \<^Const>\<open>List.append A\<close>
 |join _ = error("implicit fusion operation not defined for attribute: "^ lhs)
 (* could be extended to bool, map, multisets, ... *)
 val rhs' = instantiate_term tyenv' (generalize_term rhs)

@@ -1542,11 +1601,12 @@ fun check_invariants thy oid =
 end
 fun check_invariants' ((inv_name, pos), term) =
 let val ctxt = Proof_Context.init_global thy
+    val trivial_true = \<^term>\<open>True\<close> |> HOLogic.mk_Trueprop |> Thm.cterm_of ctxt |> Thm.trivial
 val evaluated_term = value ctxt term
 handle ERROR e =>
 if (String.isSubstring "Wellsortedness error" e)
 andalso (Config.get_global thy DOF_core.invariants_checking_with_tactics)
 then (warning("Invariants checking uses proof tactics");
 let val prop_term = HOLogic.mk_Trueprop term
 val thms = Proof_Context.get_thms ctxt (inv_name ^ def_suffixN)
 (* Get the make definition (def(1) of the record) *)

@@ -1556,20 +1616,32 @@ fun check_invariants thy oid =
 (K ((unfold_tac ctxt thms') THEN (auto_tac ctxt)))
 |> Thm.close_derivation \<^here>
 handle ERROR e =>
-ISA_core.err ("Invariant "
+let
+val msg_intro = "Invariant "
 ^ inv_name
 ^ " failed to be checked using proof tactics"
-^ " with error: "
-^ e) pos
+^ " with error:\n"
+in
+if Config.get_global thy DOF_core.invariants_strict_checking
+then ISA_core.err (msg_intro ^ e) pos
+else (ISA_core.warn (msg_intro ^ e) pos; trivial_true) end
 (* If Goal.prove does not fail, then the evaluation is considered True,
 else an error is triggered by Goal.prove *)
 in @{term True} end)
-else ISA_core.err ("Invariant " ^ inv_name ^ " violated." ) pos
+else
+let val msg_intro = "Fail to check invariant "
+^ inv_name
+^ ". Try to activate invariants_checking_with_tactics\n"
+in if Config.get_global thy DOF_core.invariants_strict_checking
+then ISA_core.err (msg_intro ^ e) pos
+else (ISA_core.warn (msg_intro ^ e) pos; term) end
 in (if evaluated_term = \<^term>\<open>True\<close>
 then ((inv_name, pos), term)
-else ISA_core.err ("Invariant " ^ inv_name ^ " violated") pos)
+else if Config.get_global thy DOF_core.invariants_strict_checking
+then ISA_core.err ("Invariant " ^ inv_name ^ " violated") pos
+else ((inv_name, pos), term))
 end
 val _ = map check_invariants' inv_and_apply_list
 in thy end

 fun create_and_check_docitem is_monitor {is_inline=is_inline} oid pos cid_pos doc_attrs thy =

@@ -1579,20 +1651,19 @@ fun create_and_check_docitem is_monitor {is_inline=is_inline} oid pos cid_pos do
 (* creates a markup label for this position and reports it to the PIDE framework;
 this label is used as jump-target for point-and-click feature. *)
 val cid_long = check_classref is_monitor cid_pos thy
+val default_cid = cid_long = DOF_core.default_cid
 val vcid = case cid_pos of NONE => NONE
 | SOME (cid,_) => if (DOF_core.is_virtual cid thy)
 then SOME (DOF_core.parse_cid_global thy cid)
 else NONE
-val value_terms = if (cid_long = DOF_core.default_cid)
+val value_terms = if default_cid
 then let
-val undefined_value = Free ("Undefined_Value", @{typ "unit"})
+val undefined_value = Free ("Undefined_Value", \<^Type>\<open>unit\<close>)
 in (undefined_value, undefined_value) end
-(*
-Handle initialization of docitem without a class associated,
-for example when you just want a document element to be referenceable
-without using the burden of ontology classes.
-ex: text*[sdf]\<open> Lorem ipsum @{thm refl}\<close>
-*)
+(* Handle initialization of docitem without a class associated,
+for example when you just want a document element to be referenceable
+without using the burden of ontology classes.
+ex: text*[sdf]\<open> Lorem ipsum @{thm refl}\<close> *)
 else let
 val defaults_init = create_default_object thy cid_long
 fun conv (na, _(*ty*), term) =(Binding.name_of na, Binding.pos_of na, "=", term);

@@ -1624,9 +1695,15 @@ fun create_and_check_docitem is_monitor {is_inline=is_inline} oid pos cid_pos do
 o Context.Theory) thy; thy)
 else thy)
 |> (fn thy => (check_inv thy; thy))
-|> (fn thy => if Config.get_global thy DOF_core.invariants_checking = true
-then check_invariants thy oid
-else thy)
+(* Bypass checking of high-level invariants when the class default_cid = "text",
+the top (default) document class.
+We want the class default_cid to stay abstract
+and not have the capability to be defined with attribute, invariants, etc.
+Hence this bypass handles docitem without a class associated,
+for example when you just want a document element to be referenceable
+without using the burden of ontology classes.
+ex: text*[sdf]\<open> Lorem ipsum @{thm refl}\<close> *)
+|> (fn thy => if default_cid then thy else check_invariants thy oid)
 end

 end (* structure Docitem_Parser *)

@@ -1810,9 +1887,7 @@ fun update_instance_command (((oid:string,pos),cid_pos),
 in
 thy |> DOF_core.update_value_global oid def_trans_input_term def_trans_value
 |> check_inv
-|> (fn thy => if Config.get_global thy DOF_core.invariants_checking = true
-then Value_Command.Docitem_Parser.check_invariants thy oid
-else thy)
+|> (fn thy => Value_Command.Docitem_Parser.check_invariants thy oid)
 end


@@ -1872,9 +1947,7 @@ fun close_monitor_command (args as (((oid:string,pos),cid_pos),
 in thy |> (fn thy => (check_lazy_inv thy; thy))
 |> update_instance_command args
 |> (fn thy => (check_inv thy; thy))
-|> (fn thy => if Config.get_global thy DOF_core.invariants_checking = true
-then Value_Command.Docitem_Parser.check_invariants thy oid
-else thy)
+|> (fn thy => Value_Command.Docitem_Parser.check_invariants thy oid)
 |> delete_monitor_entry
 end

@@ -1892,18 +1965,16 @@ fun meta_args_2_latex thy ((((lab, _), cid_opt), attr_list) : ODL_Meta_Args_Pars
 (* val _ = writeln("meta_args_2_string cid_long:"^ cid_long ) *)
 val cid_txt = "type = " ^ (enclose "{" "}" cid_long);

-fun ltx_of_term _ _ (Const ("List.list.Cons", @{typ "char \<Rightarrow> char list \<Rightarrow> char list"}) $ t1 $ t2)
-  = HOLogic.dest_string (Const ("List.list.Cons", @{typ "char \<Rightarrow> char list \<Rightarrow> char list"}) $ t1 $ t2)
-  | ltx_of_term _ _ (Const ("List.list.Nil", _)) = ""
-  | ltx_of_term _ _ (@{term "numeral :: _ \<Rightarrow> _"} $ t) = Int.toString(HOLogic.dest_numeral t)
-  | ltx_of_term ctx encl ((Const ("List.list.Cons", _) $ t1) $ t2) =
-      let val inner = (case t2 of
-                         Const ("List.list.Nil", _) => (ltx_of_term ctx true t1)
-                       | _ => ((ltx_of_term ctx false t1)^", " ^(ltx_of_term ctx false t2))
-                      )
-      in if encl then enclose "{" "}" inner else inner end
-  | ltx_of_term _ _ (Const ("Option.option.None", _)) = ""
-  | ltx_of_term ctxt _ (Const ("Option.option.Some", _)$t) = ltx_of_term ctxt true t
+fun ltx_of_term _ _ (c as \<^Const_>\<open>Cons \<^Type>\<open>char\<close> for _ _\<close>) = HOLogic.dest_string c
+  | ltx_of_term _ _ \<^Const_>\<open>Nil _\<close> = ""
+  | ltx_of_term _ _ \<^Const_>\<open>numeral _ for t\<close> = Int.toString(HOLogic.dest_numeral t)
+  | ltx_of_term ctx encl \<^Const_>\<open>Cons _ for t1 t2\<close> =
+      let val inner = (case t2 of
+                         \<^Const_>\<open>Nil _\<close> => ltx_of_term ctx true t1
+                       | _ => ((ltx_of_term ctx false t1)^", " ^(ltx_of_term ctx false t2)))
+      in if encl then enclose "{" "}" inner else inner end
+  | ltx_of_term _ _ \<^Const_>\<open>None _\<close> = ""
+  | ltx_of_term ctxt _ \<^Const_>\<open>Some _ for t\<close> = ltx_of_term ctxt true t
 | ltx_of_term ctxt _ t = ""^(Sledgehammer_Util.hackish_string_of_term ctxt t)


@@ -1965,23 +2036,6 @@ fun document_output {markdown: bool, markup: Latex.text -> Latex.text} meta_args
 val output_text = Document_Output.output_document ctxt {markdown = markdown} text;
 in markup (output_meta @ output_text) end;

-val output_name = (* was available as Latex.output_name in Isabelle 2021-1 and earlier *)
-  translate_string
-    (fn "_" => "UNDERSCORE"
-      | "'" => "PRIME"
-      | "0" => "ZERO"
-      | "1" => "ONE"
-      | "2" => "TWO"
-      | "3" => "THREE"
-      | "4" => "FOUR"
-      | "5" => "FIVE"
-      | "6" => "SIX"
-      | "7" => "SEVEN"
-      | "8" => "EIGHT"
-      | "9" => "NINE"
-      | s => s);

 fun document_output_reports name {markdown, body} meta_args text ctxt =
 let
 val pos = Input.pos_of text;

@ -1991,145 +2045,53 @@ fun document_output_reports name {markdown, body} meta_args text ctxt =
|
||||||
(pos, Markup.plain_text)];
|
(pos, Markup.plain_text)];
|
||||||
fun markup xml =
|
fun markup xml =
|
||||||
let val m = if body then Markup.latex_body else Markup.latex_heading
|
let val m = if body then Markup.latex_body else Markup.latex_heading
|
||||||
in [XML.Elem (m (output_name name), xml)] end;
|
in [XML.Elem (m (Latex.output_name name), xml)] end;
|
||||||
in document_output {markdown = markdown, markup = markup} meta_args text ctxt end;
|
in document_output {markdown = markdown, markup = markup} meta_args text ctxt end;
|
||||||

 (* document output commands *)

-local
-
-(* alternative presentation hook (workaround for missing Toplevel.present_theory) *)
-
-structure Document_Commands = Theory_Data
-(
-  type T = (string * (ODL_Meta_Args_Parser.meta_args_t
-    -> Input.source -> Proof.context -> Latex.text)) list;
-  val empty = [];
-  val merge = AList.merge (op =) (K true);
-);
-
-fun get_document_command thy name =
-  AList.lookup (op =) (Document_Commands.get thy) name;
-
-fun document_segment (segment: Document_Output.segment) =
-  (case #span segment of
-    Command_Span.Span (Command_Span.Command_Span (name, _), _) =>
-      (case try Toplevel.theory_of (#state segment) of
-        SOME thy => get_document_command thy name
-      | _ => NONE)
-  | _ => NONE);
-
-fun present_segment (segment: Document_Output.segment) =
-  (case document_segment segment of
-    SOME pr =>
-      let
-        val {span, command = tr, prev_state = st, state = st'} = segment;
-        val src = Command_Span.content (#span segment) |> filter_out Document_Source.is_improper;
-        val parse = ODL_Meta_Args_Parser.attributes -- Parse.document_source;
-        fun present ctxt =
-          let val (meta_args, text) = #1 (Token.syntax (Scan.lift parse) src ctxt);
-          in pr meta_args text ctxt end;
-        val tr' =
-          Toplevel.empty
-          |> Toplevel.name (Toplevel.name_of tr)
-          |> Toplevel.position (Toplevel.pos_of tr)
-          |> Toplevel.present (Toplevel.presentation_context #> present);
-        val st'' = Toplevel.command_exception false tr' st'
-          handle Runtime.EXCURSION_FAIL (exn, _) => Exn.reraise exn;
-        val FIXME =
-          Toplevel.setmp_thread_position tr (fn () =>
-            writeln ("present_segment" ^ Position.here (Toplevel.pos_of tr) ^ "\n" ^
-              XML.string_of (XML.Elem (Markup.empty, the_default [] (Toplevel.output_of st'))) ^ "\n---\n" ^
-              XML.string_of (XML.Elem (Markup.empty, the_default [] (Toplevel.output_of st''))))) ()
-      in {span = span, command = tr, prev_state = st, state = st''} end
-  | _ => segment);
-
-val _ =
-  Theory.setup (Thy_Info.add_presentation (fn {options, segments, ...} => fn thy =>
-    if exists (Toplevel.is_skipped_proof o #state) segments then ()
-    else
-      let
-        val segments' = map present_segment segments;
-        val body = Document_Output.present_thy options thy segments';
-      in
-        if Options.string options "document" = "false" orelse
-          forall (is_none o document_segment) segments' then ()
-        else
-          let
-            val thy_name = Context.theory_name thy;
-            val latex = Latex.isabelle_body thy_name body;
-          in Export.export thy \<^path_binding>\<open>document/latex_dof\<close> latex end
-      end));
-
-in

 fun document_command (name, pos) descr mark cmd =
-  (Outer_Syntax.command (name, pos) descr
-    (ODL_Meta_Args_Parser.attributes -- Parse.document_source >>
-      (fn (meta_args, text) =>
-        Toplevel.theory (fn thy =>
-          let
-            val thy' = cmd meta_args thy;
-            val _ =
-              (case get_document_command thy' name of
-                SOME pr => ignore (pr meta_args text (Proof_Context.init_global thy'))
-              | NONE => ());
-          in thy' end)));
-  (Theory.setup o Document_Commands.map)
-    (AList.update (op =) (name, document_output_reports name mark)));
+  Outer_Syntax.command (name, pos) descr
+    (ODL_Meta_Args_Parser.attributes -- Parse.document_source >> (fn (meta_args, text) =>
+      Toplevel.theory' (fn _ => cmd meta_args)
+        (Toplevel.presentation_context #> document_output_reports name mark meta_args text #> SOME)));

-fun document_command' (name, pos) descr mark =
-  (Outer_Syntax.command (name, pos) descr
-    (ODL_Meta_Args_Parser.attributes -- Parse.document_source >>
-      (fn (meta_args, text) =>
-        Toplevel.theory (fn thy =>
-          let
-            val _ =
-              (case get_document_command thy name of
-                SOME pr => ignore (pr meta_args text (Proof_Context.init_global thy))
-              | NONE => ());
-          in thy end)));
-  (Theory.setup o Document_Commands.map)
-    (AList.update (op =) (name, document_output_reports name mark)));
-
-end;

 (* Core Command Definitions *)

 val _ =
-  Outer_Syntax.command @{command_keyword "open_monitor*"}
+  Outer_Syntax.command \<^command_keyword>\<open>open_monitor*\<close>
     "open a document reference monitor"
     (ODL_Meta_Args_Parser.attributes
       >> (Toplevel.theory o open_monitor_command));

 val _ =
-  Outer_Syntax.command @{command_keyword "close_monitor*"}
+  Outer_Syntax.command \<^command_keyword>\<open>close_monitor*\<close>
     "close a document reference monitor"
     (ODL_Meta_Args_Parser.attributes_upd
       >> (Toplevel.theory o close_monitor_command));

 val _ =
-  Outer_Syntax.command @{command_keyword "update_instance*"}
+  Outer_Syntax.command \<^command_keyword>\<open>update_instance*\<close>
     "update meta-attributes of an instance of a document class"
     (ODL_Meta_Args_Parser.attributes_upd
       >> (Toplevel.theory o update_instance_command));

 val _ =
-  document_command ("text*", @{here}) "formal comment (primary style)"
+  document_command \<^command_keyword>\<open>text*\<close> "formal comment (primary style)"
     {markdown = true, body = true} (gen_enriched_document_cmd {inline=true} I I);

 (* This is just a stub at present *)
 val _ =
-  document_command ("text-macro*", @{here}) "formal comment macro"
+  document_command \<^command_keyword>\<open>text-macro*\<close> "formal comment macro"
     {markdown = true, body = true}
     (gen_enriched_document_cmd {inline=false} (* declare as macro *) I I);

 val _ =
-  Outer_Syntax.command @{command_keyword "declare_reference*"}
+  Outer_Syntax.command \<^command_keyword>\<open>declare_reference*\<close>
     "declare document reference"
     (ODL_Meta_Args_Parser.attributes >> (fn (((oid,pos),cid),doc_attrs) =>
       (Toplevel.theory (DOF_core.declare_object_global oid))));
@@ -2272,20 +2234,16 @@ fun meta_args_2_string thy ((((lab, _), cid_opt), attr_list) : ODL_Meta_Args_Par
 (* val _ = writeln("meta_args_2_string cid_long:"^ cid_long ) *)
 val cid_txt = "type = " ^ (enclose "{" "}" cid_long);

-fun ltx_of_term _ _ (Const (@{const_name \<open>Cons\<close>},
-      @{typ "char \<Rightarrow> char list \<Rightarrow> char list"}) $ t1 $ t2)
-    = HOLogic.dest_string (Const (@{const_name \<open>Cons\<close>},
-        @{typ "char \<Rightarrow> char list \<Rightarrow> char list"}) $ t1 $ t2)
-  | ltx_of_term _ _ (Const (@{const_name \<open>Nil\<close>}, _)) = ""
-  | ltx_of_term _ _ (@{term "numeral :: _ \<Rightarrow> _"} $ t) = Int.toString(HOLogic.dest_numeral t)
-  | ltx_of_term ctx encl ((Const (@{const_name \<open>Cons\<close>}, _) $ t1) $ t2) =
-      let val inner = (case t2 of
-              Const (@{const_name \<open>Nil\<close>}, _) => (ltx_of_term ctx true t1)
-            | _ => ((ltx_of_term ctx false t1)^", " ^(ltx_of_term ctx false t2))
-          )
+fun ltx_of_term _ _ (c as \<^Const_>\<open>Cons \<^Type>\<open>char\<close> for _ _\<close>) = HOLogic.dest_string c
+  | ltx_of_term _ _ \<^Const_>\<open>Nil _\<close> = ""
+  | ltx_of_term _ _ \<^Const_>\<open>numeral _ for t\<close> = Int.toString(HOLogic.dest_numeral t)
+  | ltx_of_term ctx encl \<^Const_>\<open>Cons _ for t1 t2\<close> =
+      let val inner = (case t2 of
+              \<^Const_>\<open>Nil _\<close> => ltx_of_term ctx true t1
+            | _ => ((ltx_of_term ctx false t1)^", " ^(ltx_of_term ctx false t2)))
       in if encl then enclose "{" "}" inner else inner end
-  | ltx_of_term _ _ (Const (@{const_name \<open>None\<close>}, _)) = ""
-  | ltx_of_term ctxt _ (Const (@{const_name \<open>Some\<close>}, _)$t) = ltx_of_term ctxt true t
+  | ltx_of_term _ _ \<^Const_>\<open>None _\<close> = ""
+  | ltx_of_term ctxt _ \<^Const_>\<open>Some _ for t\<close> = ltx_of_term ctxt true t
  | ltx_of_term ctxt _ t = ""^(Sledgehammer_Util.hackish_string_of_term ctxt t)

@@ -2444,34 +2402,15 @@ struct
 val basic_entity = Document_Output.antiquotation_pretty_source
   : binding -> 'a context_parser -> (Proof.context -> 'a -> Pretty.T) -> theory -> theory;

-fun symbex_attr_access0 ctxt proj_term term =
-  let val [subterm'] = Type_Infer_Context.infer_types ctxt [proj_term $ term]
-  in Value_Command.value ctxt (subterm') end
-
-fun compute_attr_access ctxt attr oid pos pos' = (* template *)
-  case DOF_core.get_value_global oid (Context.theory_of ctxt) of
-    SOME term => let val ctxt = (Proof_Context.init_global (Context.theory_of ctxt))
-                     val SOME{cid,pos=pos_decl,id,...} = DOF_core.get_object_local oid ctxt
-                     val markup = docref_markup false oid id pos_decl;
-                     val _ = Context_Position.report ctxt pos' markup;
-                     val (* (long_cid, attr_b,ty) = *)
-                         {long_name, typ=ty,...} =
-                           case DOF_core.get_attribute_info_local cid attr ctxt of
-                             SOME f => f
-                           | NONE => error("attribute undefined for reference: "
-                                           ^ oid ^ Position.here pos)
-                     val proj_term = Const(long_name,dummyT --> ty)
-                 in symbex_attr_access0 ctxt proj_term term end
-                 (*in Value_Command.value ctxt term end*)
-  | NONE => error("identifier not a docitem reference" ^ Position.here pos)
-
-fun compute_trace_ML ctxt oid pos pos' =
+fun compute_trace_ML ctxt oid pos_opt pos' =
 (* grabs attribute, and converts its HOL-term into (textual) ML representation *)
-  let val term = compute_attr_access ctxt "trace" oid pos pos'
-      fun conv (Const(@{const_name "Pair"},_) $ Const(s,_) $ S) = (s, HOLogic.dest_string S)
-  in map conv (HOLogic.dest_list term) end
+  let val term = ISA_core.compute_attr_access ctxt "trace" oid pos_opt pos'
+      fun conv (\<^Const>\<open>Pair \<^typ>\<open>doc_class rexp\<close> \<^typ>\<open>string\<close>\<close>
+                 $ (\<^Const>\<open>Atom \<^typ>\<open>doc_class\<close>\<close> $ (\<^Const>\<open>mk\<close> $ s)) $ S) =
+            let val s' = DOF_core.read_cid (Context.proof_of ctxt) (HOLogic.dest_string s)
+            in (s', HOLogic.dest_string S) end
+  in map conv (HOLogic.dest_list term) end

 val parse_oid = Scan.lift(Parse.position Args.name)
 val parse_cid = Scan.lift(Parse.position Args.name)

@@ -2490,7 +2429,7 @@ val parse_attribute_access' = Term_Style.parse -- parse_attribute_access
   ((string * Position.T) * (string * Position.T))) context_parser

 fun attr_2_ML ctxt ((attr:string,pos),(oid:string,pos')) = (ML_Syntax.atomic o ML_Syntax.print_term)
-  (compute_attr_access ctxt attr oid pos pos')
+  (ISA_core.compute_attr_access ctxt attr oid (SOME pos) pos')

 val TERM_STORE = let val dummy_term = Bound 0

@@ -2507,7 +2446,7 @@ fun get_instance_value_2_ML ctxt (oid:string,pos) =
 fun trace_attr_2_ML ctxt (oid:string,pos) =
   let val print_string_pair = ML_Syntax.print_pair ML_Syntax.print_string ML_Syntax.print_string
       val toML = (ML_Syntax.atomic o (ML_Syntax.print_list print_string_pair))
-  in toML (compute_trace_ML ctxt oid @{here} pos) end
+  in toML (compute_trace_ML ctxt oid NONE pos) end

 fun compute_cid_repr ctxt cid pos =
   if DOF_core.is_defined_cid_local cid ctxt then Const(cid,dummyT)

@@ -2515,12 +2454,12 @@ fun compute_cid_repr ctxt cid pos =

 local

-fun pretty_attr_access_style ctxt (style, ((oid,pos),(attr,pos'))) =
-  Document_Output.pretty_term ctxt (style (compute_attr_access (Context.Proof ctxt)
-    attr oid pos pos'));
+fun pretty_attr_access_style ctxt (style, ((attr,pos),(oid,pos'))) =
+  Document_Output.pretty_term ctxt (style (ISA_core.compute_attr_access (Context.Proof ctxt)
+    attr oid (SOME pos) pos'));
 fun pretty_trace_style ctxt (style, (oid,pos)) =
-  Document_Output.pretty_term ctxt (style (compute_attr_access (Context.Proof ctxt)
-    "trace" oid pos pos));
+  Document_Output.pretty_term ctxt (style (ISA_core.compute_attr_access (Context.Proof ctxt)
+    "trace" oid NONE pos));
 fun pretty_cid_style ctxt (style, (cid,pos)) =
   Document_Output.pretty_term ctxt (style (compute_cid_repr ctxt cid pos));

@@ -2600,7 +2539,7 @@ fun read_fields raw_fields ctxt =
   val ctxt' = fold Variable.declare_typ Ts ctxt;
 in (fields, terms, ctxt') end;

-val trace_attr = ((Binding.make("trace",@{here}), "(doc_class rexp \<times> string) list",Mixfix.NoSyn),
+val trace_attr = ((\<^binding>\<open>trace\<close>, "(doc_class rexp \<times> string) list",Mixfix.NoSyn),
   SOME "[]"): ((binding * string * mixfix) * string option)

 fun def_cmd (decl, spec, prems, params) lthy =

@@ -2612,8 +2551,7 @@ fun def_cmd (decl, spec, prems, params) lthy =
   (Frees.defined (Frees.build (Frees.add_frees lhs'))) [(x, T)]
 in lthy' end

-fun meta_eq_const T = Const (\<^const_name>\<open>Pure.eq\<close>, T --> T --> propT);
-fun mk_meta_eq (t, u) = meta_eq_const (fastype_of t) $ t $ u;
+fun mk_meta_eq (t, u) = \<^Const>\<open>Pure.eq \<open>fastype_of t\<close> for t u\<close>;

 fun define_cond binding f_sty cond_suffix read_cond (ctxt:local_theory) =
 let val bdg = Binding.suffix_name cond_suffix binding
@@ -2689,7 +2627,13 @@ fun add_doc_class_cmd overloaded (raw_params, binding)
   else error("no overloading allowed.")
 val record_fields = map_filter (check_n_filter thy) fields
 (* adding const symbol representing doc-class for Monitor-RegExps.*)
-in thy |> (Sign.add_consts_cmd [(binding, "doc_class Regular_Exp.rexp", Mixfix.NoSyn)])
+val constant_typ = \<^typ>\<open>doc_class rexp\<close>
+val constant_term = \<^Const>\<open>Atom \<^typ>\<open>doc_class\<close>\<close>
+  $ (\<^Const>\<open>mk\<close>
+     $ HOLogic.mk_string (Binding.name_of binding))
+val eq = mk_meta_eq(Free(Binding.name_of binding, constant_typ), constant_term)
+val args = (SOME(binding,NONE,NoSyn), (Binding.empty_atts,eq),[],[])
+in thy |> Named_Target.theory_map (def_cmd args)
    |> (fn thy =>
        case parent' of
          NONE => (Record.add_record
@@ -2903,8 +2847,8 @@ fun theory_text_antiquotation name =

 fun environment_delim name =
-  ("%\n\\begin{" ^ Monitor_Command_Parser.output_name name ^ "}\n",
-   "\n\\end{" ^ Monitor_Command_Parser.output_name name ^ "}");
+  ("%\n\\begin{" ^ Latex.output_name name ^ "}\n",
+   "\n\\end{" ^ Latex.output_name name ^ "}");

 fun environment_block name = environment_delim name |-> XML.enclose;

@@ -2951,7 +2895,165 @@ val _ = Outer_Syntax.command \<^command_keyword>\<open>define_macro*\<close> "d
 \<close>

+section \<open>Document context: template and ontology\<close>
+
+ML \<open>
+signature DOCUMENT_CONTEXT =
+sig
+  val template_space: Context.generic -> Name_Space.T
+  val ontology_space: Context.generic -> Name_Space.T
+  val print_template: Context.generic -> string -> string
+  val print_ontology: Context.generic -> string -> string
+  val check_template: Context.generic -> xstring * Position.T -> string * string
+  val check_ontology: Context.generic -> xstring * Position.T -> string * string
+  val define_template: binding * string -> theory -> string * theory
+  val define_ontology: binding * string -> theory -> string * theory
+  val use_template: Context.generic -> xstring * Position.T -> unit
+  val use_ontology: Context.generic -> (xstring * Position.T) list -> unit
+end;
+
+structure Document_Context: DOCUMENT_CONTEXT =
+struct
+
+(* theory data *)
+
+local
+
+structure Data = Theory_Data
+(
+  type T = string Name_Space.table * string Name_Space.table;
+  val empty : T =
+    (Name_Space.empty_table "document_template",
+     Name_Space.empty_table "document_ontology");
+  fun merge ((templates1, ontologies1), (templates2, ontologies2)) =
+    (Name_Space.merge_tables (templates1, templates2),
+     Name_Space.merge_tables (ontologies1, ontologies2));
+);
+
+fun naming_context thy =
+  Proof_Context.init_global thy
+  |> Proof_Context.map_naming (Name_Space.root_path #> Name_Space.add_path "Isabelle_DOF")
+  |> Context.Proof;
+
+fun get_space which = Name_Space.space_of_table o which o Data.get o Context.theory_of;
+
+fun print which context =
+  Name_Space.markup_extern (Context.proof_of context) (get_space which context)
+  #> uncurry Markup.markup;
+
+fun check which context arg =
+  Name_Space.check context (which (Data.get (Context.theory_of context))) arg;
+
+fun define (get, ap) (binding, arg) thy =
+  let
+    val (name, table') =
+      Data.get thy |> get |> Name_Space.define (naming_context thy) true (binding, arg);
+    val thy' = (Data.map o ap) (K table') thy;
+  in (name, thy') end;
+
+fun strip prfx sffx (path, pos) =
+  (case try (unprefix prfx) (Path.file_name path) of
+    NONE => error ("File name needs to have prefix " ^ quote prfx ^ Position.here pos)
+  | SOME a =>
+      (case try (unsuffix sffx) a of
+        NONE => error ("File name needs to have suffix " ^ quote sffx ^ Position.here pos)
+      | SOME b => b));
+
in
|
||||||
|
|
||||||
|
val template_space = get_space fst;
|
||||||
|
val ontology_space = get_space snd;
|
||||||
|
|
||||||
|
val print_template = print fst;
|
||||||
|
val print_ontology = print snd;
|
||||||
|
|
||||||
|
val check_template = check fst;
|
||||||
|
val check_ontology = check snd;
|
||||||
|
|
||||||
|
val define_template = define (fst, apfst);
|
||||||
|
val define_ontology = define (snd, apsnd);
|
||||||
|
|
||||||
|
fun use_template context arg =
|
||||||
|
let val xml = arg |> check_template context |> snd |> XML.string
|
||||||
|
in Export.export (Context.theory_of context) \<^path_binding>\<open>dof/use_template\<close> xml end;
|
||||||
|
|
||||||
|
fun use_ontology context args =
|
||||||
|
let
|
||||||
|
val xml = args
|
||||||
|
|> map (check_ontology context #> fst #> Long_Name.base_name)
|
||||||
|
|> cat_lines |> XML.string;
|
||||||
|
in Export.export (Context.theory_of context) \<^path_binding>\<open>dof/use_ontology\<close> xml end;
|
||||||
|
|
||||||
|
val strip_template = strip "root-" ".tex";
|
||||||
|
val strip_ontology = strip "DOF-" ".sty";
|
||||||
|
|
||||||
|
end;
|
||||||
|
|
||||||
|
|
||||||
|
(* Isar commands *)
|
||||||
|
|
||||||
|
val _ =
|
||||||
|
Outer_Syntax.command \<^command_keyword>\<open>use_template\<close>
|
||||||
|
"use DOF document template (as defined within theory context)"
|
||||||
|
(Parse.position Parse.name >> (fn arg =>
|
||||||
|
Toplevel.theory (fn thy => (use_template (Context.Theory thy) arg; thy))));
|
||||||
|
|
||||||
|
val _ =
|
||||||
|
Outer_Syntax.command \<^command_keyword>\<open>use_ontology\<close>
|
||||||
|
"use DOF document ontologies (as defined within theory context)"
|
||||||
|
(Parse.and_list1 (Parse.position Parse.name) >> (fn args =>
|
||||||
|
Toplevel.theory (fn thy => (use_ontology (Context.Theory thy) args; thy))));
|
||||||
|
|
||||||
|
val _ =
|
||||||
|
Outer_Syntax.command \<^command_keyword>\<open>define_template\<close>
|
||||||
|
"define DOF document template (via LaTeX root file)"
|
||||||
|
(Parse.position Resources.provide_parse_file >>
|
||||||
|
(fn (get_file, pos) => Toplevel.theory (fn thy =>
|
||||||
|
let
|
||||||
|
val (file, thy') = get_file thy;
|
||||||
|
val binding = Binding.make (strip_template (#src_path file, pos), pos);
|
||||||
|
val text = cat_lines (#lines file);
|
||||||
|
in #2 (define_template (binding, text) thy') end)));
|
||||||
|
|
||||||
|
val _ =
|
||||||
|
Outer_Syntax.command \<^command_keyword>\<open>define_ontology\<close>
|
||||||
|
"define DOF document ontology (via LaTeX style file)"
|
||||||
|
(Parse.position Resources.provide_parse_file >>
|
||||||
|
(fn (get_file, pos) => Toplevel.theory (fn thy =>
|
||||||
|
let
|
||||||
|
val (file, thy') = get_file thy;
|
||||||
|
val binding = Binding.make (strip_ontology (#src_path file, pos), pos);
|
||||||
|
val text = cat_lines (#lines file);
|
||||||
|
in #2 (define_ontology (binding, text) thy') end)));
|
||||||
|
|
||||||
|
end;
|
||||||
|
\<close>
|
||||||
|
|
||||||
|
define_template "../document-templates/root-eptcs-UNSUPPORTED.tex"
|
||||||
|
define_template "../document-templates/root-lipics-v2021-UNSUPPORTED.tex"
|
||||||
|
define_template "../document-templates/root-lncs.tex"
|
||||||
|
define_template "../document-templates/root-scrartcl.tex"
|
||||||
|
define_template "../document-templates/root-scrreprt-modern.tex"
|
||||||
|
define_template "../document-templates/root-scrreprt.tex"
|
||||||
|
define_template "../document-templates/root-svjour3-UNSUPPORTED.tex"
|
||||||
|
|
||||||
|
|
||||||
|
section \<open>Isabelle/Scala module within session context\<close>
|
||||||
|
|
||||||
|
external_file "../../etc/build.props"
|
||||||
|
external_file "../scala/dof_document_build.scala"
|
||||||
|
external_file "../scala/dof_mkroot.scala"
|
||||||
|
external_file "../scala/dof.scala"
|
||||||
|
external_file "../scala/dof_tools.scala"
|
||||||
|
|
||||||
|
scala_build_generated_files
|
||||||
|
external_files
|
||||||
|
"build.props" (in "../../etc")
|
||||||
|
and
|
||||||
|
"src/scala/dof_document_build.scala"
|
||||||
|
"src/scala/dof_mkroot.scala"
|
||||||
|
"src/scala/dof.scala"
|
||||||
|
"src/scala/dof_tools.scala" (in "../..")
|
||||||
|
|
||||||
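The exported `dof/use_ontology` artefact is just the checked ontology names, reduced to base names and joined line by line. A hedged Python sketch of that payload construction (approximating `Long_Name.base_name`, which drops the qualifier before the last dot):

```python
# Sketch of the use_ontology export payload: long names -> base names,
# one per line (cat_lines). Name handling is an approximation.
def use_ontology_payload(long_names):
    base_names = [name.rsplit(".", 1)[-1] for name in long_names]
    return "\n".join(base_names)
```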
 (*
 ML\<open>

@@ -178,7 +178,7 @@ local open RegExpChecker in

 type automaton = state * ((Int.int -> state -> state) * (state -> bool))

-val add_atom = fold_aterms (fn Const(c as(_,Type(@{type_name "rexp"},_)))=> insert (op=) c |_=>I);
+val add_atom = fold_aterms (fn Const (c as (_, \<^Type>\<open>rexp _\<close>)) => insert (op=) c | _=> I);
 fun alphabet termS = rev(map fst (fold add_atom termS []));
 fun ext_alphabet env termS =
   let val res = rev(map fst (fold add_atom termS [])) @ env;

@@ -187,14 +187,14 @@ local open RegExpChecker in
   else ()
 in res end;

-fun conv (Const(@{const_name "Regular_Exp.rexp.Zero"},_)) _ = Zero
- |conv (Const(@{const_name "Regular_Exp.rexp.One"},_)) _ = Onea
- |conv (Const(@{const_name "Regular_Exp.rexp.Times"},_) $ X $ Y) env = Times(conv X env, conv Y env)
- |conv (Const(@{const_name "Regular_Exp.rexp.Plus"},_) $ X $ Y) env = Plus(conv X env, conv Y env)
- |conv (Const(@{const_name "Regular_Exp.rexp.Star"},_) $ X) env = Star(conv X env)
- |conv (Const(@{const_name "RegExpInterface.opt"},_) $ X) env = Plus(conv X env, Onea)
- |conv (Const(@{const_name "RegExpInterface.rep1"},_) $ X) env = Times(conv X env, Star(conv X env))
- |conv (Const (s, Type(@{type_name "rexp"},_))) env =
+fun conv \<^Const_>\<open>Regular_Exp.rexp.Zero _\<close> _ = Zero
+ |conv \<^Const_>\<open>Regular_Exp.rexp.One _\<close> _ = Onea
+ |conv \<^Const_>\<open>Regular_Exp.rexp.Times _ for X Y\<close> env = Times(conv X env, conv Y env)
+ |conv \<^Const_>\<open>Regular_Exp.rexp.Plus _ for X Y\<close> env = Plus(conv X env, conv Y env)
+ |conv \<^Const_>\<open>Regular_Exp.rexp.Star _ for X\<close> env = Star(conv X env)
+ |conv \<^Const_>\<open>RegExpInterface.opt _ for X\<close> env = Plus(conv X env, Onea)
+ |conv \<^Const_>\<open>RegExpInterface.rep1 _ for X\<close> env = Times(conv X env, Star(conv X env))
+ |conv (Const (s, \<^Type>\<open>rexp _\<close>)) env =
   let val n = find_index (fn x => x = s) env
   val _ = if n<0 then error"conversion error of regexp." else ()
   in Atom(n) end
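Both sides of this hunk implement the same translation: a HOL regexp term is mapped onto the checker's constructors, with named atoms replaced by their index in the alphabet `env`, and the derived forms `opt`/`rep1` expanded. A Python sketch of that recursion (the tuple encoding of terms is purely illustrative):

```python
# Sketch of `conv` above: regexp syntax tree -> checker representation.
# Derived operators are expanded: opt X = Plus(X, One), rep1 X = Times(X, Star X).
def conv(term, env):
    kind, *args = term
    if kind == "Zero":  return ("Zero",)
    if kind == "One":   return ("Onea",)
    if kind == "Times": return ("Times", conv(args[0], env), conv(args[1], env))
    if kind == "Plus":  return ("Plus", conv(args[0], env), conv(args[1], env))
    if kind == "Star":  return ("Star", conv(args[0], env))
    if kind == "opt":   return ("Plus", conv(args[0], env), ("Onea",))
    if kind == "rep1":  return ("Times", conv(args[0], env), ("Star", conv(args[0], env)))
    if kind == "Atom":
        n = env.index(args[0]) if args[0] in env else -1  # find_index
        if n < 0:
            raise ValueError("conversion error of regexp.")
        return ("Atom", n)
    raise ValueError(f"unknown constructor {kind}")
```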

src/ROOT (2 changes)
@@ -14,4 +14,6 @@ session "Isabelle_DOF" = "Functional-Automata" +
   theories
     "DOF/Isa_DOF"
     "ontologies/ontologies"
+
+  export_classpath

@@ -19,8 +19,6 @@
 %% you need to download lipics.cls from
 %% https://www.dagstuhl.de/en/publications/lipics/instructions-for-authors/
 %% and add it manually to the praemble.tex and the ROOT file.
-%% Moreover, the option "document_comment_latex=true" needs to be set
-%% in the ROOT file.
 %%
 %% All customization and/or additional packages should be added to the file
 %% preamble.tex.
@@ -28,6 +28,8 @@ theory CENELEC_50128
   imports "Isabelle_DOF.technical_report"
 begin

+define_ontology "DOF-CENELEC_50128.sty"
+
 (* this is a hack and should go into an own ontology, providing thingsd like:
    - Assumption*
    - Hypothesis*
@@ -1009,7 +1011,7 @@ fun check_sil oid _ ctxt =
   val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
   val monitor_record_value = #value (the (DOF_core.get_object_local oid ctxt'))
   val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
-  val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
+  val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
   fun check_sil'' [] = true
     | check_sil'' (x::xs) =
       let
@@ -1041,7 +1043,7 @@ fun check_sil_slow oid _ ctxt =
   val monitor_sil_typ = (Syntax.read_typ ctxt' monitor_cid) --> @{typ "sil"}
   val monitor_sil = Value_Command.value ctxt'
     (Const("CENELEC_50128.monitor_SIL.sil", monitor_sil_typ) $ monitor_record_value)
-  val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
+  val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
   fun check_sil' [] = true
     | check_sil' (x::xs) =
       let
@@ -1072,7 +1074,7 @@ fun check_required_documents oid _ ctxt =
   val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
   val {monitor_tab,...} = DOF_core.get_data ctxt'
   val {accepted_cids, ...} = the (Symtab.lookup monitor_tab oid)
-  val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
+  val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
   fun check_required_documents' [] = true
     | check_required_documents' (cid::cids) =
       if exists (fn (doc_cid, _) => equal cid doc_cid) traces
|
|
|
@@ -28,7 +28,7 @@ doc_class A =
 subsection\<open>Excursion: On the semantic consequences of this definition: \<close>

 text\<open>This class definition leads an implicit Isabelle/HOL \<^theory_text>\<open>record\<close> definition
-(cf. \<^url>\<open>https://isabelle.in.tum.de/dist/Isabelle2021/doc/isar-ref.pdf\<close>, chapter 11.6.).
+(cf. \<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>, chapter 11.6.).
 Consequently, \<^theory_text>\<open>doc_class\<close>'es inherit the entire theory-infrastructure from Isabelle records:
 \<^enum> there is a HOL-type \<^typ>\<open>A\<close> and its extensible version \<^typ>\<open>'a A_scheme\<close>
 \<^enum> there are HOL-terms representing \<^emph>\<open>doc_class instances\<close> with the high-level syntax:

@@ -17,9 +17,11 @@ theory scholarly_paper
 imports "Isabelle_DOF.Isa_COL"
 keywords "author*" "abstract*"
          "Definition*" "Lemma*" "Theorem*" :: document_body

 begin

+define_ontology "DOF-scholarly_paper.sty"
+define_ontology "DOF-scholarly_paper-thm.sty"

 text\<open>Scholarly Paper provides a number of standard text - elements for scientific papers.
 They were introduced in the following.\<close>

@@ -46,12 +48,12 @@ doc_class abstract =

 ML\<open>
 val _ =
-  Monitor_Command_Parser.document_command ("abstract*", @{here}) "Textual Definition"
+  Monitor_Command_Parser.document_command \<^command_keyword>\<open>abstract*\<close> "Textual Definition"
     {markdown = true, body = true}
     (Onto_Macros.enriched_document_cmd_exp (SOME "abstract") []);

 val _ =
-  Monitor_Command_Parser.document_command ("author*", @{here}) "Textual Definition"
+  Monitor_Command_Parser.document_command \<^command_keyword>\<open>author*\<close> "Textual Definition"
     {markdown = true, body = true}
     (Onto_Macros.enriched_document_cmd_exp (SOME "author") []);
\<close>
@@ -67,7 +69,8 @@ which we formalize into:\<close>
 doc_class text_section = text_element +
   main_author :: "author option" <= None
   fixme_list :: "string list" <= "[]"
-  level :: "int option" <= "None"
+  (*level :: "int option" <= "None"*)
+  level :: "int option" <= "Some 0"
   (* this attribute enables doc-notation support section* etc.
      we follow LaTeX terminology on levels
        part = Some -1
@@ -132,7 +135,8 @@ doc_class technical = text_section +
   definition_list :: "string list" <= "[]"
   status :: status <= "description"
   formal_results :: "thm list"
-  invariant L1 :: "the (level \<sigma>) > 0"
+  (*invariant L1 :: "(case (level \<sigma>) of None \<Rightarrow> 1 | Some x \<Rightarrow> x) > 0"*)
+  invariant L1 :: " the (level \<sigma>) > 0"

 type_synonym tc = technical (* technical content *)

@@ -289,7 +293,7 @@ setup\<open>Theorem_default_class_setup\<close>
 ML\<open> local open ODL_Meta_Args_Parser in

 val _ =
-  Monitor_Command_Parser.document_command ("Definition*", @{here}) "Textual Definition"
+  Monitor_Command_Parser.document_command \<^command_keyword>\<open>Definition*\<close> "Textual Definition"
     {markdown = true, body = true}
     (fn meta_args => fn thy =>
       let
@@ -301,7 +305,7 @@ val _ =
     end);

 val _ =
-  Monitor_Command_Parser.document_command ("Lemma*", @{here}) "Textual Lemma Outline"
+  Monitor_Command_Parser.document_command \<^command_keyword>\<open>Lemma*\<close> "Textual Lemma Outline"
     {markdown = true, body = true}
     (fn meta_args => fn thy =>
       let
@@ -313,7 +317,7 @@ val _ =
     end);

 val _ =
-  Monitor_Command_Parser.document_command ("Theorem*", @{here}) "Textual Theorem Outline"
+  Monitor_Command_Parser.document_command \<^command_keyword>\<open>Theorem*\<close> "Textual Theorem Outline"
     {markdown = true, body = true}
     (fn meta_args => fn thy =>
       let
@@ -441,6 +445,9 @@ fun group f g cidS [] = []
  |group f g cidS (a::S) = case find_first (f a) cidS of
                             NONE => [a] :: group f g cidS S
                           | SOME cid => let val (pref,suff) = chop_prefix (g cid) S
+                                            val _ = writeln("In group a: " ^ \<^make_string> a)
+                                            val _ = writeln("In group pref: " ^ \<^make_string> pref)
+                                            val _ = writeln("In group suff: " ^ \<^make_string> suff)
                                         in (a::pref)::(group f g cidS suff) end;

 fun partition ctxt cidS trace =
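The `group` function changed in the hunk above partitions a monitor trace: whenever an element opens one of the recognized class-ids, it swallows the longest following run belonging to that segment (ML's `find_first` plus `chop_prefix`). A minimal Python sketch of that recursion, with illustrative names (the repository itself uses Isabelle/ML):

```python
def group(matches, continues, cids, trace):
    """Partition `trace` into segments, mirroring the ML `group` function.

    `matches(elem, cid)` tests whether `elem` opens a segment of class `cid`;
    `continues(cid)(elem)` tests whether `elem` still belongs to that segment.
    """
    if not trace:
        return []
    a, rest = trace[0], trace[1:]
    cid = next((c for c in cids if matches(a, c)), None)
    if cid is None:
        # no recognized class-id: the element forms a singleton group
        return [[a]] + group(matches, continues, cids, rest)
    # chop_prefix: take the longest prefix of `rest` that continues `cid`
    i = 0
    while i < len(rest) and continues(cid)(rest[i]):
        i += 1
    return [[a] + rest[:i]] + group(matches, continues, cids, rest[i:])
```

The inserted `writeln` calls in the diff merely trace `a`, `pref`, and `suff` at each step; they do not change this structure.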
@@ -453,15 +460,18 @@ fun dest_option _ (Const (@{const_name "None"}, _)) = NONE

 in

-fun check ctxt cidS mon_id pos =
-    let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos @{here}
+fun check ctxt cidS mon_id pos_opt =
+    let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos_opt @{here}
+        val _ = writeln("In check trace: " ^ \<^make_string> trace)
+        val _ = writeln("In check cidS: " ^ \<^make_string> cidS)
         val groups = partition (Context.proof_of ctxt) cidS trace
-        fun get_level_raw oid = AttributeAccess.compute_attr_access ctxt "level" oid @{here} @{here};
+        val _ = writeln("In check groups: " ^ \<^make_string> groups)
+        fun get_level_raw oid = ISA_core.compute_attr_access ctxt "level" oid NONE @{here};
         fun get_level oid = dest_option (snd o HOLogic.dest_number) (get_level_raw (oid));
         fun check_level_hd a = case (get_level (snd a)) of
                                   NONE => error("Invariant violation: leading section" ^ snd a ^
                                                 " must have lowest level")
-                                | SOME X => X
+                                | SOME X => (writeln("In check_level level: " ^ \<^make_string> X ^ ", a: " ^ \<^make_string> a);X)
         fun check_group_elem level_hd a = case (get_level (snd a)) of
                                              NONE => true
                                            | SOME y => if level_hd <= y then true
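The `check` function above enforces a section-level invariant on the partitioned trace: the leading element of each group must carry a level, and no later element with a level may sit below it. A small Python sketch of that check (hypothetical names; the original is Isabelle/ML and raises `error` instead of an exception):

```python
def check_levels(groups, level_of):
    """Check the section-level invariant on partitioned trace groups:
    the leading element of each group must carry the lowest level.

    `level_of(elem)` returns an int level or None (ML's `int option`).
    """
    for g in groups:
        head_level = level_of(g[0])
        if head_level is None:
            # mirrors: error("Invariant violation: leading section ... must have lowest level")
            raise ValueError("leading section %r must have lowest level" % (g[0],))
        for elem in g[1:]:
            lvl = level_of(elem)
            if lvl is not None and lvl < head_level:
                return False
    return True
```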
@@ -484,10 +494,10 @@ end
 setup\<open> let val cidS = ["scholarly_paper.introduction","scholarly_paper.technical",
                         "scholarly_paper.example", "scholarly_paper.conclusion"];
           fun body moni_oid _ ctxt = (Scholarly_paper_trace_invariant.check
-                                        ctxt cidS moni_oid @{here};
+                                        ctxt cidS moni_oid NONE;
                                       true)
       in DOF_core.update_class_invariant "scholarly_paper.article" body end\<close>

 ML\<open> \<close>

 section\<open>Miscelleous\<close>

@@ -65,8 +65,8 @@ doc_class result = technical +


 ML\<open>fun check_invariant_invariant oid {is_monitor:bool} ctxt =
-  let val kind_term = AttributeAccess.compute_attr_access ctxt "kind" oid @{here} @{here}
-      val property_termS = AttributeAccess.compute_attr_access ctxt "property" oid @{here} @{here}
+  let val kind_term = ISA_core.compute_attr_access ctxt "kind" oid NONE @{here}
+      val property_termS = ISA_core.compute_attr_access ctxt "property" oid NONE @{here}
       val tS = HOLogic.dest_list property_termS
   in case kind_term of
       @{term "proof"} => if not(null tS) then true
@@ -137,10 +137,10 @@ fun dest_option _ (Const (@{const_name "None"}, _)) = NONE

 in

-fun check ctxt cidS mon_id pos =
-    let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos @{here}
+fun check ctxt cidS mon_id pos_opt =
+    let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos_opt @{here}
         val groups = partition (Context.proof_of ctxt) cidS trace
-        fun get_level_raw oid = AttributeAccess.compute_attr_access ctxt "level" oid @{here} @{here};
+        fun get_level_raw oid = ISA_core.compute_attr_access ctxt "level" oid NONE @{here};
         fun get_level oid = dest_option (snd o HOLogic.dest_number) (get_level_raw (oid));
         fun check_level_hd a = case (get_level (snd a)) of
                                   NONE => error("Invariant violation: leading section" ^ snd a ^
@@ -165,7 +165,7 @@ end
 \<close>

 setup\<open> let val cidS = ["small_math.introduction","small_math.technical", "small_math.conclusion"];
-          fun body moni_oid _ ctxt = (Small_Math_trace_invariant.check ctxt cidS moni_oid @{here};
+          fun body moni_oid _ ctxt = (Small_Math_trace_invariant.check ctxt cidS moni_oid NONE;
                                       true)
       in DOF_core.update_class_invariant "small_math.article" body end\<close>

@@ -17,6 +17,8 @@ theory technical_report
 imports "Isabelle_DOF.scholarly_paper"
 begin

+define_ontology "DOF-technical_report.sty"
+
 (* for reports paper: invariant: level \<ge> -1 *)

 section\<open>More Global Text Elements for Reports\<close>

@@ -0,0 +1,120 @@
+/*
+ * Copyright (c)
+ *   2021-2022 The University of Exeter.
+ *   2021-2022 The University of Paris-Saclay.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in
+ *    the documentation and/or other materials provided with the
+ *    distribution.
+ * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
+ * FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
+ * COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
+ * INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
+ * BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
+ * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+ * CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
+ * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
+ * ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
+ * POSSIBILITY OF SUCH DAMAGE.
+ *
+ * SPDX-License-Identifier: BSD-2-Clause
+ */
+
+
+/*** constants and parameters for Isabelle/DOF ***/
+
+package isabelle.dof
+
+import isabelle._
+
+
+object DOF {
+  /** parameters **/
+
+  val isabelle_version = "2022"
+  val isabelle_url = "https://isabelle.in.tum.de/website-Isabelle2022"
+
+  val afp_version = "afp-2022-10-27"
+
+  // Isabelle/DOF version: "Unreleased" for development, semantic version for releases
+  val version = "Unreleased"
+
+  val session = "Isabelle_DOF"
+
+  val latest_version = "1.3.0"
+  val latest_isabelle = "Isabelle2021-1"
+  val latest_doi = "10.5281/zenodo.6810799"
+  val generic_doi = "10.5281/zenodo.3370482"
+
+  // Isabelle/DOF source repository
+  val url = "https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF"
+
+  // Isabelle/DOF release artifacts
+  val artifact_dir = "releases/Isabelle_DOF/Isabelle_DOF"
+  val artifact_host = "artifacts.logicalhacking.com"
+  val artifact_url: String = "https://" + artifact_host + "/" + artifact_dir
+
+  def options(opts: Options): Options = opts + "document_comment_latex"
+
+
+
+  /** Isabelle tool wrapper **/
+
+  sealed case class Parameter(name: String, value: String) {
+    override def toString: String = name
+
+    def print(value_only: Boolean): String =
+      if (value_only) value else name + "=" + value
+  }
+
+  val parameters: List[Parameter] =
+    List(
+      Parameter("isabelle_version", isabelle_version),
+      Parameter("afp_version", afp_version),
+      Parameter("dof_version", version)
+    ).sortBy(_.name)
+
+  def print_parameters(names: List[String],
+    all: Boolean = false,
+    value_only: Boolean = false,
+    progress: Progress = new Progress
+  ): Unit = {
+    val bad = names.filter(name => !parameters.exists(_.name == name))
+    if (bad.nonEmpty) error("Unknown parameter(s): " + commas_quote(bad))
+
+    val params = if (all) parameters else parameters.filter(p => names.contains(p.name))
+    for (p <- params) progress.echo(p.print(value_only))
+  }
+
+  val isabelle_tool = Isabelle_Tool("dof_param", "print Isabelle/DOF parameters",
+    Scala_Project.here, args =>
+    {
+      var all = false
+      var value_only = false
+
+      val getopts = Getopts("""
+Usage: isabelle dof_param [OPTIONS] NAMES
+
+  Options are:
+    -a           print all parameters
+    -b           print values only (default: NAME=VALUE)
+
+  Print given Isabelle/DOF parameters, with names from the list:
+  """ + commas_quote(parameters.map(_.toString)),
+        "a" -> (_ => all = true),
+        "b" -> (_ => value_only = true))
+
+      val names = getopts(args)
+      if (names.isEmpty && !all) getopts.usage()
+
+      print_parameters(names, all = all, value_only = value_only, progress = new Console_Progress)
+    })
+}
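The new `dof_param` tool added above validates the requested parameter names and then prints each selected parameter either as `NAME=VALUE` or as the bare value. The core selection/formatting logic can be sketched in Python (illustrative; the tool itself is Scala inside the Isabelle system framework):

```python
def print_parameters(parameters, names, show_all=False, value_only=False):
    """Sketch of the dof_param core logic: reject unknown names, then
    format the selected parameters sorted by name.

    `parameters` is a dict name -> value (the tool uses a sorted List[Parameter]).
    """
    bad = [n for n in names if n not in parameters]
    if bad:
        # mirrors: error("Unknown parameter(s): " + commas_quote(bad))
        raise ValueError("Unknown parameter(s): " + ", ".join(bad))
    selected = parameters if show_all else {n: parameters[n] for n in names}
    return [v if value_only else n + "=" + v for n, v in sorted(selected.items())]
```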
@@ -1,7 +1,7 @@
 /*
  * Copyright (c)
  *   2021-2022 The University of Exeter.
  *   2021-2022 The University of Paris-Saclay.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
@@ -28,6 +28,8 @@
  * SPDX-License-Identifier: BSD-2-Clause
  */

+/*** document build engine for Isabelle/DOF ***/
+
 package isabelle.dof

 import isabelle._
@@ -37,61 +39,61 @@ object DOF_Document_Build
 {
   class Engine extends Document_Build.Bash_Engine("dof")
   {
-    // override def use_build_script: Boolean = true
+    def the_document_entry(context: Document_Build.Context, name: String): Export.Entry = {
+      val entries =
+        for {
+          node_name <- context.document_theories
+          entry <- context.session_context.get(node_name.theory, name)
+        } yield entry
+
+      entries match {
+        case List(entry) => entry
+        case Nil =>
+          error("Missing export " + quote(name) + " for document theories of session " +
+            quote(context.session))
+        case dups =>
+          error("Multiple exports " + quote(name) + " for theories " +
+            commas_quote(dups.map(_.theory_name).sorted.distinct))
+      }
+    }
+
     override def prepare_directory(
       context: Document_Build.Context,
       dir: Path,
       doc: Document_Build.Document_Variant): Document_Build.Directory =
     {
-      val regex = """^.*\.""".r
-      val latex_output = new Latex_Output(context.options)
+      val options = DOF.options(context.options)
+      val latex_output = new Latex_Output(options)
       val directory = context.prepare_directory(dir, doc, latex_output)

-      // produced by alternative presentation hook (workaround for missing Toplevel.present_theory)
-      for {
-        name <- context.document_theories.iterator
-        entry <- context.session_context.get(name.theory, Export.DOCUMENT_LATEX + "_dof")
-      } {
-        val path = Path.basic(Document_Build.tex_name(name))
-        val xml = YXML.parse_body(entry.text)
-        File.content(path, xml).output(latex_output(_, file_pos = path.implode_symbolic))
-          .write(directory.doc_dir)
-      }
-      val isabelle_dof_dir = context.session_context.sessions_structure("Isabelle_DOF").dir
-      // print(context.options.string("dof_url"));
+      val isabelle_dof_dir = context.session_context.sessions_structure(DOF.session).dir

-      // copy Isabelle/DOF LaTeX templates
-      val template_dir = isabelle_dof_dir + Path.explode("document-templates")
-      // TODO: error handling in case 1) template does not exist or 2) root.tex does already exist
-      val template = regex.replaceAllIn(context.options.string("dof_template"),"")
-      Isabelle_System.copy_file(template_dir + Path.explode("root-"+template+".tex"),
-        directory.doc_dir+Path.explode("root.tex"))
-
-      // copy Isabelle/DOF LaTeX styles
+      // LaTeX styles from Isabelle/DOF directory
       List(Path.explode("DOF/latex"), Path.explode("ontologies"))
-        .flatMap(dir =>
-          File.find_files((isabelle_dof_dir + dir).file,
-            file => file.getName.endsWith(".sty"), include_dirs = true))
+        .flatMap(dir => File.find_files((isabelle_dof_dir + dir).file, _.getName.endsWith(".sty")))
         .foreach(sty => Isabelle_System.copy_file(sty, directory.doc_dir.file))

-      // create ontology.sty
-      val ltx_styles = context.options.string("dof_ontologies").split(" +").map(s => regex.replaceAllIn(s,""))
-      File.write(directory.doc_dir+Path.explode("ontologies.tex"),
-        ltx_styles.mkString("\\usepackage{DOF-","}\n\\usepackage{DOF-","}\n"))
+      // ontologies.tex from session exports
+      File.write(directory.doc_dir + Path.explode("ontologies.tex"),
+        split_lines(the_document_entry(context, "dof/use_ontology").text)
+          .map(name => "\\usepackage{DOF-" + name + "}\n").mkString)

-      // create dof-config.sty
-      File.write(directory.doc_dir+Path.explode("dof-config.sty"), """
-\newcommand{\isabelleurl}{https://isabelle.in.tum.de/website-Isabelle2022/""" + context.options.string("dof_isabelle") + """}
-\newcommand{\dofurl}{""" + context.options.string("dof_url") + """}
-\newcommand{\dof@isabelleversion}{""" + context.options.string("dof_isabelle") + """}
-\newcommand{\isabellefullversion}{""" + context.options.string("dof_isabelle") + """\xspace}
-\newcommand{\dof@version}{""" + context.options.string("dof_version") + """}
-\newcommand{\dof@artifacturl}{""" + context.options.string("dof_artifact_dir") + """}
-\newcommand{\doflatestversion}{""" + context.options.string("dof_latest_version") + """}
-\newcommand{\isadoflatestdoi}{""" + context.options.string("dof_latest_doi") + """}
-\newcommand{\isadofgenericdoi}{""" + context.options.string("dof_generic_doi") + """}
-\newcommand{\isabellelatestversion}{""" + context.options.string("dof_latest_isabelle") + """}
+      // root.tex from session exports
+      File.write(directory.doc_dir + Path.explode("root.tex"),
+        the_document_entry(context, "dof/use_template").text)
+
+      // dof-config.sty
+      File.write(directory.doc_dir + Path.explode("dof-config.sty"), """
+\newcommand{\isabelleurl}{""" + DOF.isabelle_url + """}
+\newcommand{\dofurl}{""" + DOF.url + """}
+\newcommand{\dof@isabelleversion}{""" + DOF.isabelle_version + """}
+\newcommand{\isabellefullversion}{""" + DOF.isabelle_version + """\xspace}
+\newcommand{\dof@version}{""" + DOF.version + """}
+\newcommand{\dof@artifacturl}{""" + DOF.artifact_url + """}
+\newcommand{\doflatestversion}{""" + DOF.latest_version + """}
+\newcommand{\isadoflatestdoi}{""" + DOF.latest_doi + """}
+\newcommand{\isadofgenericdoi}{""" + DOF.generic_doi + """}
+\newcommand{\isabellelatestversion}{""" + DOF.latest_isabelle + """}
 """)
       directory
     }

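The build-engine change above replaces the old option-string mechanism: `ontologies.tex` is now generated from the `dof/use_ontology` session export, one `\usepackage{DOF-...}` line per ontology name. A small Python sketch of that string transformation (illustrative only; the engine is Scala and filters empty lines via its own `split_lines`):

```python
def ontologies_tex(use_ontology_export):
    """Turn the 'dof/use_ontology' export text (one ontology name per line)
    into the contents of ontologies.tex, one \\usepackage per ontology."""
    names = [line for line in use_ontology_export.splitlines() if line]
    return "".join("\\usepackage{DOF-%s}\n" % name for name in names)
```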
@@ -1,7 +1,7 @@
 /*
  * Copyright (c)
  *   2021-2022 The University of Exeter.
  *   2021-2022 The University of Paris-Saclay.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
|
@ -29,32 +29,32 @@
|
||||||
*/
|
*/
|
||||||
|
|
||||||
|
|
||||||
/* Author: Makarius
|
/*** create session root directory for Isabelle/DOF ***/
|
||||||
|
|
||||||
Prepare session root directory for Isabelle/DOF.
|
|
||||||
*/
|
|
||||||
|
|
||||||
package isabelle.dof
|
package isabelle.dof
|
||||||
|
|
||||||
import isabelle._
|
import isabelle._
|
||||||
|
|
||||||
|
|
||||||
object DOF_Mkroot
|
object DOF_Mkroot
|
||||||
{
|
{
|
||||||
/** mkroot **/
|
/** mkroot **/
|
||||||
|
|
||||||
|
val default_ontologies: List[String] = List("technical_report", "scholarly_paper")
|
||||||
|
val default_template = "scrreprt-modern"
|
||||||
|
|
||||||
def mkroot(
|
def mkroot(
|
||||||
session_name: String = "",
|
session_name: String = "",
|
||||||
session_dir: Path = Path.current,
|
session_dir: Path = Path.current,
|
||||||
session_parent: String = "",
|
|
||||||
init_repos: Boolean = false,
|
init_repos: Boolean = false,
|
||||||
template: String = "",
|
ontologies: List[String] = default_ontologies,
|
||||||
ontologies: List[String] = List(),
|
template: String = default_template,
|
||||||
|
quiet: Boolean = false,
|
||||||
progress: Progress = new Progress): Unit =
|
progress: Progress = new Progress): Unit =
|
||||||
{
|
{
|
||||||
Isabelle_System.make_directory(session_dir)
|
Isabelle_System.make_directory(session_dir)
|
||||||
|
|
||||||
val name = proper_string(session_name) getOrElse session_dir.absolute_file.getName
|
val name = proper_string(session_name) getOrElse session_dir.absolute_file.getName
|
||||||
val parent = proper_string(session_parent) getOrElse Isabelle_System.getenv("ISABELLE_LOGIC")
|
|
||||||
|
|
||||||
val root_path = session_dir + Sessions.ROOT
|
val root_path = session_dir + Sessions.ROOT
|
||||||
if (root_path.file.exists) error("Cannot overwrite existing " + root_path)
|
if (root_path.file.exists) error("Cannot overwrite existing " + root_path)
|
||||||
|
@@ -62,47 +62,54 @@ object DOF_Mkroot
     val document_path = session_dir + Path.explode("document")
     if (document_path.file.exists) error("Cannot overwrite existing " + document_path)

-    progress.echo("\nPreparing session " + quote(name) + " in " + session_dir)
+    progress.echo(
+      (if (quiet) "" else "\n") +
+        "Initializing session " + quote(name) + " in " + session_dir.absolute)


     /* ROOT */

-    progress.echo("  creating " + root_path)
+    progress.echo_if(!quiet, "  creating " + root_path)

     File.write(root_path,
-      "session " + Mkroot.root_name(name) + " = " + Mkroot.root_name(parent) + """ +
-  options [document = pdf, document_output = "output", document_build = dof, dof_ontologies = """"
-        + ontologies.mkString(" ") + """", dof_template = """ + Mkroot.root_name(template)
-        + """, document_comment_latex=true]
+      "session " + Mkroot.root_name(name) + " = " + Mkroot.root_name(DOF.session) + """ +
+  options [document = pdf, document_output = "output", document_build = dof]
 (*theories [document = false]
     A
     B*)
   theories
     """ + Mkroot.root_name(name) + """
   document_files
     "preamble.tex"
 """)

-    val thy = session_dir + Path.explode(name+".thy")
-    progress.echo("  creating " + thy)
+    val thy = session_dir + Path.explode(name + ".thy")
+    progress.echo_if(!quiet, "  creating " + thy)
     File.write(thy,
-      "theory\n " + name + "\nimports\n " + ontologies.mkString("\n ") + """
+      "theory\n  " + name +
+        "\nimports\n  " + ontologies.map("Isabelle_DOF." + _).mkString("\n  ") + """
 begin

+use_template """ + quote(template) + """
+use_ontology """ + ontologies.map(quote).mkString(" and ") + """
+
 end
 """)


+    /* preamble */
+
     val preamble_tex = session_dir + Path.explode("document/preamble.tex")
-    progress.echo("  creating " + preamble_tex)
+    progress.echo_if(!quiet, "  creating " + preamble_tex)
     Isabelle_System.make_directory(preamble_tex.dir)
     File.write(preamble_tex,"""%% This is a placeholder for user-specific configuration and packages.""")


     /* Mercurial repository */

     if (init_repos) {
-      progress.echo(" \nInitializing Mercurial repository " + session_dir)
+      progress.echo(
+        (if (quiet) "" else "\n") + "Initializing Mercurial repository " + session_dir.absolute)

       val hg = Mercurial.init_repository(session_dir)

@@ -130,7 +137,7 @@ syntax: regexp

     {
       val print_dir = session_dir.implode
-      progress.echo("""
+      progress.echo_if(!quiet, """
 Now use the following command line to build the session:

   isabelle build -D """ +
@@ -139,42 +146,41 @@ Now use the following command line to build the session:
   }



   /** Isabelle tool wrapper **/

-  val isabelle_tool = Isabelle_Tool("dof_mkroot", "prepare session root directory for Isabelle/DOF",
+  val isabelle_tool = Isabelle_Tool("dof_mkroot", "create session root directory for Isabelle/DOF",
     Scala_Project.here, args =>
     {
       var init_repos = false
       var help = false
       var session_name = ""
-      var session_parent = "Isabelle_DOF"
-      var ontologies:List[String] = List()
-      var template = session_parent + ".scrartcl"
-      val default_ontologies = List(session_parent+".scholarly_paper")
+      var ontologies = default_ontologies
+      var template = default_template
+      var quiet = false

       val getopts = Getopts("""
 Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]

   Options are:
     -I           init Mercurial repository and add generated files
+    -h           print help
     -n NAME      alternative session name (default: directory base name)
-    -o ONTOLOGY  ontology (default: scholarly_paper)
-    -t TEMPLATE  tempalte (default: scrartcl)
+    -o NAMES     list of ontologies, separated by blanks
+                 (default: """ + quote(Word.implode(default_ontologies)) + """)
+    -q           quiet mode: less verbosity
+    -t NAME      template (default: """ + quote(default_template) + """)

-  Prepare session root directory (default: current directory).
+  Create session root directory for Isabelle/DOF (default: current directory).
 """,
-        "I" -> (arg => init_repos = true),
+        "I" -> (_ => init_repos = true),
+        "h" -> (_ => help = true),
         "n:" -> (arg => session_name = arg),
-        "o:" -> (arg => ontologies = ontologies :+ arg),
-        "t:" -> (arg => template = arg),
-        "h" -> (arg => help = true)
-      )
+        "o:" -> (arg => ontologies = Word.explode(arg)),
+        "q" -> (_ => quiet = true),
+        "t:" -> (arg => template = arg))

       val more_args = getopts(args)

-      ontologies = if (ontologies.isEmpty) default_ontologies else ontologies
||||||
if (help) {getopts.usage()} else {()}
|
|
||||||
val session_dir =
|
val session_dir =
|
||||||
more_args match {
|
more_args match {
|
||||||
case Nil => Path.current
|
case Nil => Path.current
|
||||||
|
@ -182,7 +188,12 @@ Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]
|
||||||
case _ => getopts.usage()
|
case _ => getopts.usage()
|
||||||
}
|
}
|
||||||
|
|
||||||
mkroot(session_parent=session_parent, session_name = session_name, session_dir = session_dir, init_repos = init_repos,
|
if (help) getopts.usage()
|
||||||
ontologies = ontologies, template = template, progress = new Console_Progress)
|
|
||||||
|
val progress = new Console_Progress
|
||||||
|
|
||||||
|
mkroot(session_name = session_name, session_dir = session_dir,
|
||||||
|
init_repos = init_repos, quiet = quiet, progress = progress,
|
||||||
|
ontologies = ontologies, template = template)
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
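The reworked option handling above reads several ontologies from a single `-o` argument via `Word.explode`, which splits on blanks. A rough standalone sketch of that behavior follows; the `explode`/`implode` helpers are simplified stand-ins for Isabelle/Scala's `Word` module, not its actual code:

```scala
// Sketch of blank-separated ontology lists as accepted by "dof_mkroot -o".
// explode/implode approximate Isabelle/Scala's Word.explode/Word.implode.
object OntologyArgs {
  def explode(s: String): List[String] =
    s.split("\\s+").toList.filter(_.nonEmpty)

  def implode(words: List[String]): String = words.mkString(" ")

  def main(args: Array[String]): Unit = {
    val default = List("Isabelle_DOF.scholarly_paper")
    // "-o" replaces the default list wholesale, as in the new getopts handler.
    val parsed = explode("Isabelle_DOF.scholarly_paper Isabelle_DOF.technical_report")
    val ontologies = if (parsed.isEmpty) default else parsed
    println(implode(ontologies))
  }
}
```

Note the design change in the diff itself: the old version accumulated ontologies across repeated `-o` flags (`ontologies :+ arg`), while the new version takes one blank-separated list and falls back to `default_ontologies` inside `mkroot`.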
@@ -28,6 +28,8 @@
 * SPDX-License-Identifier: BSD-2-Clause
 */


+/*** command-line tools for Isabelle/DOF ***/
+
package isabelle.dof

@@ -35,5 +37,6 @@ import isabelle._


class DOF_Tools extends Isabelle_Scala_Tools(
-  DOF_Mkroot.isabelle_tool
+  DOF.isabelle_tool,
+  DOF_Mkroot.isabelle_tool,
)
@@ -37,13 +37,13 @@ ML\<open>

find_theorems (60) name:"Conceptual.M."

-value [simp]"trace(M.make undefined [] ())"
-value "ok(M.make undefined_AAA [] ())"
-value "trace(M.make undefined_AAA [] ())"
-value "tag_attribute(M.make undefined_AAA [] ())"
+value [simp]"M.trace(M.make undefined [] ())"
+value "M.ok(M.make undefined_AAA [] ())"
+value "M.trace(M.make undefined_AAA [] ())"
+value "M.tag_attribute(M.make undefined_AAA [] ())"


-value "ok(M.make 0 [] ())"
+value "M.ok(M.make 0 [] ())"
(*
value "ok(M.make undefined [] ())"
value "ok(M.make 0 [] undefined)"
@@ -168,7 +168,7 @@ update_instance*[omega::E, y+="[''en'']"]
ML\<open> val s = map HOLogic.dest_string (HOLogic.dest_list @{docitem_attribute y::omega}); \<close>

subsection\<open> Example text antiquotation:\<close>
-text\<open> @{docitem_attribute omega::y} \<close>
+text\<open> @{docitem_attribute y::omega} \<close>


section\<open>Simulation of a Monitor\<close>

@@ -193,6 +193,23 @@ ML \<open>@{trace_attribute figs1}\<close>
text\<open>Resulting trace of figs as text antiquotation:\<close>
text\<open>@{trace_attribute figs1}\<close>

+text\<open>Test trace_attribute term antiquotation:\<close>
+
+term*\<open>map snd @{trace-attribute \<open>figs1\<close>}\<close>
+value*\<open>map snd @{trace-attribute \<open>figs1\<close>}\<close>
+term*\<open>map fst @{trace-attribute \<open>aaa\<close>}\<close>
+value*\<open>map fst @{trace-attribute \<open>aaa\<close>}\<close>
+value*\<open>(map fst @{trace-attribute \<open>aaa\<close>}) \<close>
+
+definition example_expression where "example_expression \<equiv> \<lbrace>\<lfloor>''Conceptual.A''\<rfloor> || \<lfloor>''Conceptual.F''\<rfloor>\<rbrace>\<^sup>*"
+value* \<open> DA.accepts (na2da (rexp2na example_expression)) (map fst @{trace-attribute \<open>aaa\<close>}) \<close>
+
+definition word_test :: "'a list \<Rightarrow> 'a rexp \<Rightarrow> bool" (infix "is-in" 60)
+  where " w is-in rexp \<equiv> DA.accepts (na2da (rexp2na rexp)) (w)"
+
+value* \<open> (map fst @{trace-attribute \<open>aaa\<close>}) is-in example_expression \<close>


(*<*)
text\<open>Final Status:\<close>
print_doc_items
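The `word_test` infix added above checks whether a monitor trace, read as the word of its class names, is accepted by a regular expression such as `\<lbrace>\<lfloor>''Conceptual.A''\<rfloor> || \<lfloor>''Conceptual.F''\<rfloor>\<rbrace>\<^sup>*`. A loose standalone analogue of that idea, with Scala's built-in regex matching standing in for the `rexp2na`/`na2da`/`DA.accepts` machinery (all names below are illustrative, not DOF or AFP API):

```scala
// Sketch: a monitor trace is a list of (class-name, instance-name) pairs;
// the acceptance test asks whether the class-name word matches a regexp.
// java.util.regex (via String.matches) stands in for rexp2na/na2da/DA.accepts.
object MonitorSketch {
  type Trace = List[(String, String)]

  def accepts(trace: Trace, classWordRegex: String): Boolean =
    trace.map(_._1).mkString(" ").matches(classWordRegex)

  def main(args: Array[String]): Unit = {
    val aaa: Trace = List(("Conceptual.A", "a1"), ("Conceptual.F", "f1"))
    // analogue of {| ⌊''Conceptual.A''⌋ || ⌊''Conceptual.F''⌋ |}* :
    val expr = "((Conceptual\\.A|Conceptual\\.F) ?)*"
    println(accepts(aaa, expr))  // prints "true"
  }
}
```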
@@ -44,7 +44,7 @@ subsection*[b::A, x = "5"] \<open> Lorem ipsum dolor sit amet, ... \<close>

text\<open>Setting a sample invariant, referring to attribute value "x":\<close>
ML\<open>fun check_A_invariant oid {is_monitor:bool} ctxt =
-  let val term = AttributeAccess.compute_attr_access ctxt "x" oid @{here} @{here}
+  let val term = ISA_core.compute_attr_access ctxt "x" oid NONE @{here}
      val (@{typ "int"},x_value) = HOLogic.dest_number term
  in if x_value > 5 then error("class A invariant violation") else true end
\<close>
@@ -80,8 +80,11 @@ to take sub-classing into account:
\<close>

ML\<open>fun check_M_invariant oid {is_monitor} ctxt =
-  let val term = AttributeAccess.compute_attr_access ctxt "trace" oid @{here} @{here}
-      fun conv (Const(@{const_name "Pair"},_) $ Const(s,_) $ S) = (s, HOLogic.dest_string S)
+  let val term = ISA_core.compute_attr_access ctxt "trace" oid NONE @{here}
+      fun conv (\<^Const>\<open>Pair \<^typ>\<open>doc_class rexp\<close> \<^typ>\<open>string\<close>\<close>
+                $ (\<^Const>\<open>Atom \<^typ>\<open>doc_class\<close>\<close> $ (\<^Const>\<open>mk\<close> $ s)) $ S) =
+            let val s' = DOF_core.read_cid (Context.proof_of ctxt) (HOLogic.dest_string s)
+            in (s', HOLogic.dest_string S) end
      val string_pair_list = map conv (HOLogic.dest_list term)
      val cid_list = map fst string_pair_list
      val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
@@ -113,10 +116,15 @@ subsection*[f::E] \<open> Lectus accumsan velit ultrices, ... }\<close>
section*[f2::E] \<open> Lectus accumsan velit ultrices, ... }\<close>
*)

-ML\<open>val term = AttributeAccess.compute_attr_access
-                  (Context.Proof @{context}) "trace" "struct" @{here} @{here} ;
+ML\<open>val ctxt = @{context}
+   val term = ISA_core.compute_attr_access
+                  (Context.Proof ctxt) "trace" "struct" NONE @{here} ;
   fun conv (Const(@{const_name "Pair"},_) $ Const(s,_) $ S) = (s, HOLogic.dest_string S)
-  val string_pair_list = map conv (HOLogic.dest_list term)
+  fun conv' (\<^Const>\<open>Pair \<^typ>\<open>doc_class rexp\<close> \<^typ>\<open>string\<close>\<close>
+             $ (\<^Const>\<open>Atom \<^typ>\<open>doc_class\<close>\<close> $ (\<^Const>\<open>mk\<close> $ s)) $ S) =
+       let val s' = DOF_core.read_cid ctxt (HOLogic.dest_string s)
+       in (s', HOLogic.dest_string S) end
+  val string_pair_list = map conv' (HOLogic.dest_list term)
\<close>
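`check_A_invariant` above rejects any instance whose integer attribute `x` exceeds 5. A hypothetical Scala rendering of the same check, with a plain `Map` standing in for `ISA_core.compute_attr_access` and the document context (none of these names are real DOF API):

```scala
// Sketch of the class-A invariant from the diff: attribute "x" must not exceed 5.
// A Map[String, Int] stands in for the document-instance attribute store.
object InvariantSketch {
  def checkAInvariant(oid: String, attrs: Map[String, Int]): Boolean = {
    val x = attrs.getOrElse("x", sys.error(s"no attribute x in $oid"))
    if (x > 5) sys.error(s"class A invariant violation in $oid")
    true
  }

  def main(args: Array[String]): Unit = {
    println(checkAInvariant("b", Map("x" -> 5)))  // prints "true"
    // checkAInvariant("b2", Map("x" -> 6))       // would raise the violation
  }
}
```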
@@ -12,7 +12,7 @@ text\<open>
theory attribute must be set:\<close>


-declare[[invariants_checking = true]]
+declare[[invariants_strict_checking = true]]

text\<open>For example, let's define the following two classes:\<close>

@@ -124,7 +124,6 @@ update_instance*[introduction2::introduction

text*[introduction3::introduction, authored_by = "{@{author \<open>church\<close>}}", level = "Some 2"]\<open>\<close>
text*[introduction4::introduction, authored_by = "{@{author \<open>curry\<close>}}", level = "Some 4"]\<open>\<close>
-
text*[resultProof2::result, evidence = "proof", property="[@{thm \<open>HOL.sym\<close>}]"]\<open>\<close>

text\<open>Then we can evaluate expressions with instances:\<close>

@@ -144,6 +143,6 @@ value*\<open>evidence @{result \<open>resultProof\<close>} = evidence @{result \
declare[[invariants_checking_with_tactics = false]]

-declare[[invariants_checking = false]]
+declare[[invariants_strict_checking = false]]

end
@@ -1,6 +1,6 @@
-chapter \<open>Notes on Isabelle/DOF for Isabelle2021-1\<close>
+chapter \<open>Notes on Isabelle/DOF for Isabelle2022\<close>

-theory "Isabelle2021-1"
+theory "Isabelle2022"
  imports Main
begin

@@ -44,33 +44,9 @@ text \<open>
  we see more and more alternatives, e.g. system options or services in
  Isabelle/Scala (see below).

-  \<^item> \<^file>\<open>$ISABELLE_DOF_HOME/etc/options\<close>
-
-    \<^item> options declared as \<^verbatim>\<open>public\<close> appear in the Isabelle/jEdit dialog
-      \<^action>\<open>plugin-options\<close> (according to their \<^verbatim>\<open>section\<close>)
-
-    \<^item> all options (public and non-public) are available for command-line
-      usage, e.g. \<^verbatim>\<open>isabelle build -o dof_url="..."\<close>
-
-    \<^item> access to options in Isabelle/ML:
-
-      \<^item> implicit (for the running ML session)
-        \<^ML>\<open>Options.default_string \<^system_option>\<open>dof_url\<close>\<close>
-
-      \<^item> explicit (e.g. for each theories section in
-        \<^file>\<open>$ISABELLE_HOME/src/Pure/Tools/build.ML\<close>):
-        \<^ML>\<open>fn options => Options.string options \<^system_option>\<open>dof_url\<close>\<close>
-
-    \<^item> access in Isabelle/Scala is always explicit; the initial options
-      should be created only once and passed around as explicit argument:
-
-      \<^scala>\<open>{
-        val options = isabelle.Options.init();
-        options.string("dof_url");
-      }\<close>
-
-      Note: there are no antiquotations in Isabelle/Scala, so the literal
-      string \<^scala>\<open>"dof_url"\<close> is unchecked.
+  \<^item> \<^path>\<open>$ISABELLE_DOF_HOME/etc/options\<close> should not be used for regular
+    Isabelle/DOF applications: thus it works properly within Isabelle/AFP,
+    where the component context is missing.
\<close>

@@ -107,13 +83,8 @@ subsection \<open>Document commands\<close>

text \<open>
  Isar toplevel commands now support a uniform concept for
-  \<^ML_type>\<open>Toplevel.presentation\<close>, but the exported interfaces are
-  limited to commands that do not change the semantic state: see
+  \<^ML_type>\<open>Toplevel.presentation\<close>, e.g. see
  \<^ML>\<open>Toplevel.present\<close> and \<^ML>\<open>Toplevel.present_local_theory\<close>.
-
-  Since \<^verbatim>\<open>Toplevel.present_theory\<close> is missing in Isabelle2021-1, we use a
-  workaround with an alternative presentation hook: it exports
-  \<^verbatim>\<open>document/latex_dof\<close> files instead of regular \<^verbatim>\<open>document/latex_dof\<close>.
\<close>

@@ -174,10 +145,7 @@ section \<open>Miscellaneous NEWS and Notes\<close>

text \<open>
  \<^item> Document preparation: there are many new options etc. that might help
-    to fine-tune DOF output, e.g. \<^system_option>\<open>document_comment_latex\<close>.
+    to fine-tune DOF output, e.g. \<^system_option>\<open>document_heading_prefix\<close>.

-  \<^item> ML: Theory_Data / Generic_Data: "val extend = I" has been removed;
-    obsolete since Isabelle2021.
-
  \<^item> ML: \<^ML>\<open>Thm.instantiate\<close> and related operations now use explicit
    abstract types for the instantiate, see \<^file>\<open>~~/src/Pure/term_items.ML\<close>

@@ -196,6 +164,8 @@ text \<open>
  example:

  \<^ML>\<open>\<^Type>\<open>nat\<close>\<close>
+  \<^ML>\<open>\<^Type>\<open>prod \<^Type>\<open>int\<close> \<^Type>\<open>int\<close>\<close>\<close>

  \<^ML>\<open>fn (A, B) => \<^Type>\<open>fun A B\<close>\<close>
  \<^ML>\<open>\<^Type_fn>\<open>fun A B => \<open>(A, B)\<close>\<close>\<close>
@@ -67,7 +67,7 @@ fun gen_enriched_document_command2 name {body} cid_transform attr_transform mark
   val strg = XML.string_of (hd (Latex.output text))
   val file = {path = Path.make [oid ^ "_snippet.tex"],
               pos = @{here},
-              content = strg}
+              content = Bytes.string strg}

   val _ = Generated_Files.write_file (Path.make ["latex_test"])
                                      file
@@ -125,7 +125,7 @@ val _ =

\<close>

-
+ML\<open>open Bytes\<close>
text\<open>And here a tex - text macro.\<close>
text\<open>Pythons ReStructuredText (RST).
    @{url \<open>https://de.wikipedia.org/wiki/ReStructuredText\<close>}. Tool: Sphinx.
@@ -352,8 +352,8 @@ Config.get ;

(*
\begin{figure}[h]
    \centering
    \includegraphics[scale=0.5]{graph_a}
    \caption{An example graph}
@@ -362,35 +362,37 @@ Config.get ;


\begin{figure}
    \centering
    \begin{subfigure}[b]{0.3\textwidth}
        \centering
        \includegraphics[width=\textwidth]{graph1}
        \caption{$y=x$}
-       \label{fig:y equals x}
+       \label{fig:y equals x} (* PROBLEM *)
    \end{subfigure}
    \hfill
    \begin{subfigure}[b]{0.3\textwidth}
        \centering
        \includegraphics[width=\textwidth]{graph2}
        \caption{$y=3sinx$}
-       \label{fig:three sin x}
+       \label{fig:three sin x} (* PROBLEM *)
    \end{subfigure}
    \hfill
    \begin{subfigure}[b]{0.3\textwidth}
        \centering
        \includegraphics[width=\textwidth]{graph3}
        \caption{$y=5/x$}
-       \label{fig:five over x}
+       \label{fig:five over x} (* PROBLEM *)
    \end{subfigure}
    \caption{Three simple graphs}
@@ -436,7 +438,7 @@ fun figure_antiq (check: Proof.context -> Path.T option -> Input.source -> Path.
               else error "scale out of range (must be between 1 and 100"
   val fig_args = Library.enclose "[" "]" (commas [wdth_ltx,scl_ltx])
   val _ = writeln cap
-  fun proportion () = "0."^ (Int.toString (100 div length(!figure_proportions)))
+  fun proportion () = "0."^ (Int.toString (100 div List.length(!figure_proportions)))
      \<comment> \<open>naive: assumes equal proportions\<close>
   fun core arg = "\n\\centering\n"
                  ^"\\includegraphics"
@@ -502,7 +504,8 @@ Figure*[fff::figure,src="\<open>this is a side-by-side\<close>"]
\<close>

-Figure*[ffff::figure,(* caption *) src="\<open>this is another 2 side-by-side\<close>"]
+Figure*[ffff::figure2, caption="\<open>this is another 2 side-by-side\<close>"]
     \<open>@{figure_content [width=40, scale=35, caption="This is a left test"] "../ROOT"}\<close>
     \<open>@{figure_content [width=40, scale=35, caption="This is a right test"] "../ROOT"}\<close>

@@ -510,5 +513,18 @@ Figure*[ffff::figure,(* caption *) src="\<open>this is another 2 side-by-side\<close>"]
text\<open> @{figure "ffff(2)"}\<close>
*)

+Figure*[figxxx::figure2,caption="\<open>Proofs establishing an Invariant Preservation.\<close>"]
+  \<open>@{boxed_theory_text [display]
+    \<open>lemma inv_c2_preserved : "c2_inv \<sigma> \<Longrightarrow> c1_inv (\<sigma> \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e)"
+      unfolding c1_inv_def c2_inv_def
+                Computer_Hardware_to_Hardware_morphism_def
+                Product_to_Component_morphism_def
+      by (auto simp: comp_def)
+
+     lemma Computer_Hardware_to_Hardware_total :
+       "Computer_Hardware_to_Hardware_morphism ` ({X. c2_inv X})
+          \<subseteq> ({X::Hardware. c1_inv X})"
+       using inv_c2_preserved by auto\<close>}\<close>

end
(*>*)
@@ -7,6 +7,7 @@ session "Isabelle_DOF-tests" = "Isabelle_DOF" +
    "TermAntiquotations"
    "Attributes"
    "Evaluation"
-   "Isabelle2021-1"
    "High_Level_Syntax_Invariants"
    "Ontology_Matching_Example"
+  theories [condition = ISABELLE_DOF_HOME]
+    "Isabelle2022"