Compare commits


67 Commits

Author SHA1 Message Date
Nicolas Méric 963923be3f Add normalization to update_instance* command 2022-12-12 14:37:27 +01:00
Nicolas Méric 55988f1b43 Normalize docitem value programatically 2022-12-12 14:37:27 +01:00
Nicolas Méric 0b2d28b547 Update error message for invariant checking 2022-12-09 16:11:57 +01:00
Nicolas Méric 37d7ed7d17 Update rails for annotated text element in manual 2022-12-09 15:13:22 +01:00
Nicolas Méric 312734afbd Update Attributes examples 2022-12-09 15:12:38 +01:00
Burkhart Wolff 8cee80d78e advanced example on trace-attribute term-antiquotations 2022-12-07 16:01:38 +01:00
Makarius Wenzel ec0d525426 Tuned messages, following Isabelle/d6a2a8bc40e1 2022-12-05 15:21:26 +01:00
Makarius Wenzel 791990039b Tuned messages and options, following Isabelle/c7f3e94fce7b 2022-12-05 12:37:59 +01:00
Makarius Wenzel 78d61390fe Prefer Isar command, instead of its underlying ML implementation 2022-12-05 11:50:12 +01:00
Makarius Wenzel ffcf1f3240 Add missing file (amending 5471d873a9) 2022-12-04 19:26:28 +01:00
Makarius Wenzel 5471d873a9 Isabelle/Scala module within session context supports document_build = "dof" without component setup 2022-12-04 19:13:08 +01:00
Makarius Wenzel df37250a00 Simplified args, following README.md 2022-12-04 19:00:23 +01:00
Makarius Wenzel 185daeb577 Tuned 2022-12-04 18:25:29 +01:00
Makarius Wenzel 8037fd15f2 Tuned messages, following isabelle.Export.message 2022-12-04 18:20:54 +01:00
Makarius Wenzel afcd78610b More concise export artifact 2022-12-04 18:03:53 +01:00
Makarius Wenzel b8a9ef5118 Tuned comments 2022-12-04 16:38:56 +01:00
Makarius Wenzel a4e75c8b12 Clarified export name for the sake of low-level errors 2022-12-04 16:35:55 +01:00
Makarius Wenzel d20e9ccd22 Proper session qualifier for theory imports (amending 44cae2e631) 2022-12-04 00:45:07 +01:00
Makarius Wenzel f2ee5d3780 Tuned 2022-12-04 00:10:43 +01:00
Makarius Wenzel 44cae2e631 More formal management of ontologies in Isabelle/ML/Isar with output via Isabelle/Scala exports 2022-12-04 00:09:29 +01:00
Makarius Wenzel 7b2bf35353 More strict treatment of document export artifacts 2022-12-03 14:54:14 +01:00
Makarius Wenzel e8c7fa6018 Clarified signature 2022-12-03 14:44:04 +01:00
Makarius Wenzel b12e61511d Discourage etc/options 2022-12-03 13:55:56 +01:00
Makarius Wenzel 3cac42e6cb Clarified order 2022-12-03 12:39:00 +01:00
Makarius Wenzel aee8ba1df1 Prefer DOF parameters over Isabelle options 2022-12-03 12:37:58 +01:00
Makarius Wenzel d93e1383d4 Afford full-scale command-line tool 2022-12-03 12:29:24 +01:00
Makarius Wenzel 3d5d1e7476 Further attempts at woodpecker environment 2022-12-02 22:54:02 +01:00
Makarius Wenzel 4264e7cd15 Build Scala/Java components to get proper ISABELLE_CLASSPATH 2022-12-02 21:40:59 +01:00
Makarius Wenzel 96f4077c53 Tuned message 2022-12-02 21:29:45 +01:00
Makarius Wenzel d7fb39d7eb Adhoc command-line tool replaces old options 2022-12-02 21:14:55 +01:00
Makarius Wenzel b95826962f Tuned documentation 2022-12-02 20:29:40 +01:00
Makarius Wenzel 912d4bb49e Maintain document template in Isabelle/ML via Isar commands:
result becomes export artifact, which is harvested by Isabelle/Scala build engine
2022-12-02 20:05:15 +01:00
Makarius Wenzel a6c1a2baa4 Removed obsolete "extend" operation 2022-12-02 15:31:23 +01:00
Makarius Wenzel bb5963c6e2 Proper usage of dof_mkroot, although its Bash pretty-printing in LaTeX is a bit odd 2022-12-02 14:35:17 +01:00
Makarius Wenzel cc3e2a51a4 More antiquotations 2022-12-02 13:50:16 +01:00
Makarius Wenzel 9e4e5b49eb More antiquotations from Isabelle2021-1/2022 2022-12-02 11:41:31 +01:00
Makarius Wenzel b65ecbdbef Updated to Isabelle2022 2022-12-02 10:34:15 +01:00
Makarius Wenzel 3be2225dcf Tuned comments 2022-12-01 22:54:01 +01:00
Makarius Wenzel f44f0af01c Use regular Toplevel.presentation from Isabelle2022, without alternative presentation hook 2022-12-01 22:48:45 +01:00
Makarius Wenzel 9a11baf840 Latex.output_name name is back in Isabelle2022 2022-12-01 22:04:56 +01:00
Makarius Wenzel 48c167aa23 Proper DOF.artifact_url 2022-12-01 21:45:06 +01:00
Makarius Wenzel 700a9bbfee clarified DOF.options: hard-wired document_comment_latex always uses LaTeX version of comment.sty 2022-12-01 21:30:32 +01:00
Makarius Wenzel 73299941ad Tuned 2022-12-01 17:26:29 +01:00
Makarius Wenzel 5a8c438c41 Omit excessive quotes 2022-12-01 16:48:33 +01:00
Makarius Wenzel 7772c73aaa More accurate defaults 2022-12-01 16:39:41 +01:00
Makarius Wenzel ca18453043 Clarified signature: more explicit types and operations 2022-12-01 16:28:44 +01:00
Makarius Wenzel 1a122b1a87 More robust default 2022-12-01 15:48:52 +01:00
Makarius Wenzel 47d95c467e Tuned whitespace 2022-12-01 15:33:16 +01:00
Makarius Wenzel bf3085d4c0 Clairifed defaults and command-line options 2022-12-01 15:26:48 +01:00
Makarius Wenzel 068e6e0411 Tuned 2022-12-01 14:23:00 +01:00
Makarius Wenzel 09e9980691 Tuned 2022-12-01 14:22:32 +01:00
Makarius Wenzel 94ce3fdec2 Prefer constants in Scala, to make this independent from component context 2022-12-01 14:15:17 +01:00
Makarius Wenzel 44819bff02 Updated message, following c29ec9641a 2022-12-01 12:44:03 +01:00
Makarius Wenzel a6ab1e101e Update Isabelle + AFP URLs 2022-12-01 11:55:51 +01:00
Makarius Wenzel c29ec9641a Simplified installation 2022-12-01 11:45:12 +01:00
Nicolas Méric 06833aa190 Upddate single argument handling for compute_attr_access
Trigger error when the attribute is not specified as an argument
of the antiquatation and is not an attribujte of the instance.
(In these case, the position of the attribute is NONE)
2022-11-28 10:05:47 +01:00
Nicolas Méric 4f0c7e1e95 Fix type unification clash for trace_attribute term antiquotation 2022-11-25 08:57:59 +01:00
Nicolas Méric 0040949cf8 Add trace-attribute term antiquotation
- Make doc_class type and constant used by regular expression
  in monitors ground
- Make class tag attribute ground (with serial())
- The previous items make possible
  the evaluation of the trace attribute
  and the definition of the trace-attribute term annotation
2022-11-24 16:47:21 +01:00
Nicolas Méric e68c332912 Fix markup for some antiquotations
Fix markup for docitem_attribute and trace_attribute
ML and text antiquotations
2022-11-24 11:22:02 +01:00
Burkhart Wolff b2c4f40161 Some LaTeX experiments with Achim 2022-11-18 10:30:33 +01:00
Burkhart Wolff 309952e0ce syntactic rearrangements 2022-11-09 11:19:00 +01:00
Burkhart Wolff 830e1b440a ported another Figure* in OutOfOrderPresntn to Isabelle2022 2022-11-09 06:06:30 +01:00
Burkhart Wolff 2149db9efc semantics of fig_content (untested) 2022-11-08 20:52:58 +01:00
Burkhart Wolff 1547ace64b added some semantics to fig_content 2022-11-08 19:27:07 +01:00
Burkhart Wolff 39acd61dfd Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2022-11-08 10:03:30 +01:00
Burkhart Wolff 29770b17ee added syntax for fig_content 2022-11-08 10:03:15 +01:00
Achim D. Brucker b4f4048cff Made clear that more than two AFP entries are required. 2022-11-07 17:05:04 +00:00
43 changed files with 1910 additions and 527 deletions

View File

@@ -7,9 +7,10 @@ pipeline:
- export `isabelle getenv ISABELLE_HOME_USER`
- mkdir -p $ISABELLE_HOME_USER/etc
- echo "ISABELLE_PDFLATEX=\"$LATEX --file-line-error\"" >> $ISABELLE_HOME_USER/etc/settings
- isabelle components -u `pwd`
- isabelle build -D . -o browser_info
- isabelle dof_mkroot DOF_test
- isabelle components -u .
- isabelle build -D . -o browser_info
- isabelle dof_mkroot -q DOF_test
- isabelle build -D DOF_test
- cp -r $ISABELLE_HOME_USER/browser_info $ARTIFACT_DIR
- cd $ARTIFACT_DIR

View File

@@ -194,8 +194,8 @@ for i in $VARS; do
export "$i"
done
ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL options -g dof_isabelle)"
DOF_VERSION="$($ISABELLE_TOOL options -g dof_version)"
ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL dof_param -b isabelle_version)"
DOF_VERSION="$($ISABELLE_TOOL dof_param -b dof_version)"
ISABELLE_SHORT_VERSION=`echo $ISABELLE_VERSION | sed -e 's/:.*$//'`
ISADOF_TAR="Isabelle_DOF-"$DOF_VERSION"_"$ISABELLE_SHORT_VERSION

View File

@@ -10,10 +10,10 @@ online.](https://artifacts.logicalhacking.com/releases/Isabelle_DOF/Isabelle_DOF
Isabelle/DOF has three major prerequisites:
* **Isabelle:** Isabelle/DOF requires [Isabelle
2022](http://isabelle.in.tum.de/website-Isabelle2022/). Please download the
2022](https://isabelle.in.tum.de/website-Isabelle2022/). Please download the
Isabelle 2022 distribution for your operating system from the [Isabelle
website](http://isabelle.in.tum.de/website-Isabelle2022/).
* **AFP:** Isabelle/DOF requires two entries from the [Archive of Formal Proofs
website](https://isabelle.in.tum.de/website-Isabelle2022/).
* **AFP:** Isabelle/DOF requires several entries from the [Archive of Formal Proofs
(AFP)](https://www.isa-afp.org/). Please install the AFP following the
instructions given at <https://www.isa-afp.org/using.html>.
* **LaTeX:** Isabelle/DOF requires a modern LaTeX installation, i.e., at least
@@ -29,7 +29,7 @@ this command already during the installation of the prerequisites, you can skip
it now):
```console
foo@bar:~$ isabelle components -u `pwd`
foo@bar:~$ isabelle components -u .
```
The final step for the installation is:
@@ -91,15 +91,19 @@ templates:
```console
foo@bar:~$ isabelle dof_mkroot -h
Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]
Options are:
-I init Mercurial repository and add generated files
-h print help
-n NAME alternative session name (default: directory base name)
-o ONTOLOGY ontology (default: scholarly_paper)
-t TEMPLATE tempalte (default: scrartcl)
-o NAMES list of ontologies, separated by blanks
(default: "technical_report scholarly_paper")
-q quiet mode: less verbosity
-t NAME template (default: "scrreprt-modern")
Prepare session root directory (default: current directory).
Create session root directory for Isabelle/DOF (default: current directory).
```
## Releases

View File

@@ -4,6 +4,7 @@ no_build = false
requirements = \
env:ISABELLE_SCALA_JAR
sources = \
src/scala/dof.scala \
src/scala/dof_document_build.scala \
src/scala/dof_mkroot.scala \
src/scala/dof_tools.scala

View File

@@ -1,30 +0,0 @@
(* :mode=isabelle-options: *)
section "Isabelle/DOF"
public option dof_template : string = "scrreprt-modern"
-- "Default document template for Isabelle/DOF documents"
public option dof_ontologies : string = "Isabelle_DOF.technical_report Isabelle_DOF.scholarly_paper"
-- "Isabelle/DOF ontologies (separated by blanks)"
option dof_version : string = "Unreleased"
-- "Isabelle/DOF version"
(* "Unreleased" for development, semantic version for releases *)
option dof_isabelle : string = "2022"
option dof_afp : string = "afp-2022-10-27"
option dof_latest_version : string = "1.3.0"
option dof_latest_isabelle : string = "Isabelle2021-1"
option dof_latest_doi : string = "10.5281/zenodo.6810799"
option dof_generic_doi : string = "10.5281/zenodo.3370482"
option dof_url : string = "https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF"
-- "Isabelle/DOF source repository"
option dof_artifact_dir : string = "releases/Isabelle_DOF/Isabelle_DOF"
-- "Isabelle/DOF release artifacts"
option dof_artifact_host : string = "artifacts.logicalhacking.com"

View File

@@ -1,5 +1,4 @@
# -*- shell-script -*- :mode=shellscript:
ISABELLE_DOF_HOME="$COMPONENT"
ISABELLE_DOCS="$ISABELLE_DOF_HOME/doc":"$ISABELLE_DOCS"
ISABELLE_DOCS="$ISABELLE_DOF_HOME/doc:$ISABELLE_DOCS"

View File

@@ -1,7 +1,5 @@
session "mini_odo" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof,
dof_ontologies = "Isabelle_DOF.technical_report Isabelle_DOF.cenelec_50128",
dof_template = "Isabelle_DOF.scrreprt-modern"]
options [document = pdf, document_output = "output", document_build = dof]
sessions
"Physical_Quantities"
theories

View File

@@ -19,6 +19,8 @@ imports
"Isabelle_DOF.technical_report"
"Physical_Quantities.SI" "Physical_Quantities.SI_Pretty"
begin
use_template "scrreprt-modern"
use_ontology technical_report and CENELEC_50128
declare[[strict_monitor_checking=true]]
define_shortcut* dof \<rightleftharpoons> \<open>\dof\<close>
isadof \<rightleftharpoons> \<open>\isadof{}\<close>
@@ -294,7 +296,7 @@ and the global model parameters such as wheel diameter, the number of teeth per
sampling frequency etc., we can infer the maximal time of service as well the maximum distance
the device can measure. As an example configuration, choosing:
\<^item> \<^term>\<open>(1 *\<^sub>Q metre)::real[m]\<close> for \<^term>\<open>w\<^sub>d\<close> (wheel-diameter),
\<^item> \<^term>\<open>(1 *\<^sub>Q metre):: real[m]\<close> for \<^term>\<open>w\<^sub>d\<close> (wheel-diameter),
\<^item> \<^term>\<open>100 :: real\<close> for \<^term>\<open>tpw\<close> (teeth per wheel),
\<^item> \<^term>\<open>80 *\<^sub>Q kmh :: real[m\<cdot>s\<^sup>-\<^sup>1]\<close> for \<^term>\<open>Speed\<^sub>M\<^sub>a\<^sub>x\<close>,
\<^item> \<^term>\<open>14.4 *\<^sub>Q kHz :: real[s\<^sup>-\<^sup>1]\<close> for the sampling frequency,
@@ -654,9 +656,9 @@ text*[t10::test_result]
test-execution via, \<^eg>, a makefile or specific calls to a test-environment or test-engine. \<close>
text
\<open> Finally some examples of references to doc-items, i.e. text-elements
with declared meta-information and status. \<close>
text \<open> Finally some examples of references to doc-items, i.e. text-elements
with declared meta-information and status. \<close>
text \<open> As established by @{test_result (unchecked) \<open>t10\<close>},
@{test_result (define) \<open>t10\<close>} \<close>
text \<open> the @{test_result \<open>t10\<close>}

View File

@@ -16,6 +16,9 @@ theory IsaDofApplications
imports "Isabelle_DOF.scholarly_paper"
begin
use_template "lncs"
use_ontology "scholarly_paper"
open_monitor*[this::article]
declare[[strict_monitor_checking=false]]

View File

@@ -1,7 +1,5 @@
session "2018-cicm-isabelle_dof-applications" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof,
dof_ontologies = "Isabelle_DOF.scholarly_paper", dof_template = Isabelle_DOF.lncs,
quick_and_dirty = true]
options [document = pdf, document_output = "output", document_build = dof, quick_and_dirty = true]
theories
IsaDofApplications
document_files

View File

@@ -1,6 +1,5 @@
session "2020-iFM-csp" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof,
dof_ontologies = "Isabelle_DOF.scholarly_paper", dof_template = Isabelle_DOF.scrartcl]
options [document = pdf, document_output = "output", document_build = dof]
theories
"paper"
document_files

View File

@@ -6870,7 +6870,7 @@ isbn="978-3-540-48509-4"
title = {{Isabelle's} Logic: {HOL}},
author = {Tobias Nipkow and Lawrence C. Paulson and Markus Wenzel},
year = 2009,
misc = {\url{http://isabelle.in.tum.de/library/HOL/}}
misc = {\url{https://isabelle.in.tum.de/library/HOL/}}
}
@InProceedings{ garson.ea:security:2008,
@@ -11000,7 +11000,7 @@ isbn="978-1-4471-3182-3"
journal = {Archive of Formal Proofs},
month = apr,
year = 2019,
note = {\url{http://isa-afp.org/entries/HOL-CSP.html}},
note = {\url{https://isa-afp.org/entries/HOL-CSP.html}},
ISSN = {2150-914x},
}

View File

@@ -3,6 +3,8 @@ theory "paper"
imports "Isabelle_DOF.scholarly_paper"
begin
use_template "scrartcl"
use_ontology "scholarly_paper"
open_monitor*[this::article]

View File

@@ -159,9 +159,6 @@ replaced by built-in document templates.\<close> for users are:
\<^LaTeX>-styles) need to be registered in this file. For details of Isabelle's build system, please
consult the Isabelle System Manual~@{cite "wenzel:system-manual:2020"}. In addition to the
standard features of, this file also contains \<^isadof> specific configurations:
\<^item> \<^boxed_bash>\<open>dof_ontologies\<close> a list of (fully qualified) ontologies, separated by spaces, used
by the project.
\<^item> \<^boxed_bash>\<open>dof_template\<close> the (fully qualified) document template.
\<^item> \<^boxed_bash>\<open>document_build=dof\<close> needs to be present, to tell Isabelle, to use the
Isabelle/DOF backend for the document generation.
\<^item> The file \<^boxed_bash>\<open>preamble.tex\<close>\<^index>\<open>preamble.tex\<close>, which allows users to add additional
@@ -186,12 +183,14 @@ Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]
Options are:
-I init Mercurial repository and add generated files
-h print help
-n NAME alternative session name (default: directory base name)
-o ONTOLOGY ontology (default: scholarly_paper)
-t TEMPLATE tempalte (default: scrartcl)
Prepare session root directory (default: current directory).\<close>}
-o NAMES list of ontologies, separated by blanks
(default: "technical_report scholarly_paper")
-q quiet mode: less verbosity
-t NAME template (default: "scrreprt-modern")
Create session root directory for Isabelle/DOF (default: current directory).\<close>}
\<close>
section*[writing_doc::technical]\<open>Writing Documents: General Remarks\<close>

View File

@@ -337,8 +337,8 @@ is currently only available in the SML API's of the kernel.
( @@{command "text*"} '[' meta_args ']' '\<open>' formal_text '\<close>'
| @@{command "ML*"} ('[' meta_args ']')? '\<open>' SML_code '\<close>'
| @@{command "term*"} ('[' meta_args ']')? '\<open>' HOL_term '\<close>'
| @@{command "value*"} \<newline> ('[' meta_args ']' ('[' evaluator ']')?)? '\<open>' HOL_term '\<close>'
| @@{command "assert*"} \<newline> ('[' meta_args ']' ('[' evaluator ']')?)? '\<open>' HOL_term '\<close>'
| (@@{command "value*"}
| @@{command "assert*"}) \<newline> ('[' meta_args ']' ('[' evaluator ']')?)? '\<open>' HOL_term '\<close>'
)
\<close>
\<^rail>\<open>
@@ -1182,9 +1182,9 @@ text\<open>
fun check_result_inv oid {is_monitor:bool} ctxt =
let
val kind =
AttributeAccess.compute_attr_access ctxt "evidence" oid <@>{here} <@>{here}
ISA_core.compute_attr_access ctxt "evidence" oid NONE <@>{here}
val prop =
AttributeAccess.compute_attr_access ctxt "property" oid <@>{here} <@>{here}
ISA_core.compute_attr_access ctxt "property" oid NONE <@>{here}
val tS = HOLogic.dest_list prop
in case kind of
<@>{term "proof"} => if not(null tS) then true
@@ -1233,7 +1233,8 @@ value*\<open>filter (\<lambda>\<sigma>. the (text_section.level \<sigma>) > 1)
section*[infrastructure::technical]\<open>Technical Infrastructure\<close>
text\<open>
text\<open>
%FIXME update story concerning "list"
The list of fully supported (\<^ie>, supporting both interactive ontological modeling and
document generation) ontologies and the list of supported document templates can be
obtained by calling \<^boxed_bash>\<open>isabelle dof_mkroot -h\<close> (see \<^technical>\<open>first_project\<close>).
@ -1320,6 +1321,7 @@ text\<open>
text\<open>
Developing a new document template ``\<^boxed_bash>\<open>bar\<close>'' requires the following steps:
\<^item> develop a new \<^LaTeX>-template \<^boxed_bash>\<open>src/document-templates/root-bar.tex\<close>
\<^item> add a suitable \<^theory_text>\<open>define_template\<close> command to theory \<^theory>\<open>Isabelle_DOF.Isa_DOF\<close>.
\<^item> activation of the new document template by executing the \<^boxed_bash>\<open>install\<close> script. You can skip the lengthy
checks for the AFP entries and the installation of the Isabelle patch by using the
\<^boxed_bash>\<open>--skip-afp\<close> option:

View File

@@ -53,7 +53,6 @@ text\<open>
\<open>structure Data = Generic_Data
( type T = docobj_tab * docclass_tab * ...
val empty = (initial_docobj_tab, initial_docclass_tab, ...)
val extend = I
fun merge((d1,c1,...),(d2,c2,...)) = (merge_docobj_tab (d1,d2,...),
merge_docclass_tab(c1,c2,...))
);\<close>}

View File

@@ -15,6 +15,8 @@
theory "Isabelle_DOF-Manual"
imports "05_Implementation"
begin
use_template "scrreprt-modern"
use_ontology "technical_report" and "CENELEC_50128"
close_monitor*[this]
check_doc_global
text\<open>Resulting trace in \<^verbatim>\<open>doc_item\<close> ''this'': \<close>

View File

@@ -1,7 +1,5 @@
session "Isabelle_DOF-Manual" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof,
dof_ontologies = "Isabelle_DOF.technical_report Isabelle_DOF.cenelec_50128", dof_template = "Isabelle_DOF.scrreprt-modern",
quick_and_dirty = true]
options [document = pdf, document_output = "output", document_build = dof, quick_and_dirty = true]
theories
"Isabelle_DOF-Manual"
document_files

View File

@@ -451,7 +451,7 @@
journal = {Archive of Formal Proofs},
month = may,
year = 2010,
note = {\url{http://isa-afp.org/entries/Regular-Sets.html}, Formal
note = {\url{https://isa-afp.org/entries/Regular-Sets.html}, Formal
proof development},
issn = {2150-914x}
}
@@ -462,7 +462,7 @@
journal = {Archive of Formal Proofs},
month = mar,
year = 2004,
note = {\url{http://isa-afp.org/entries/Functional-Automata.html},
note = {\url{https://isa-afp.org/entries/Functional-Automata.html},
Formal proof development},
issn = {2150-914x}
}

View File

@@ -1,7 +1,5 @@
session "TR_MyCommentedIsabelle" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof,
dof_ontologies = "Isabelle_DOF.technical_report", dof_template = Isabelle_DOF.scrreprt,
quick_and_dirty = true]
options [document = pdf, document_output = "output", document_build = dof, quick_and_dirty = true]
theories
"TR_MyCommentedIsabelle"
document_files

View File

@@ -14,9 +14,11 @@
(*<*)
theory TR_MyCommentedIsabelle
imports "Isabelle_DOF.technical_report"
begin
use_template "scrreprt"
use_ontology "technical_report"
define_shortcut* isabelle \<rightleftharpoons> \<open>Isabelle/HOL\<close>
open_monitor*[this::report]
@@ -62,7 +64,7 @@ text\<open> \<^vs>\<open>-0.5cm\<close>
Isabelle and Isabelle/HOL, a complementary text to the unfortunately somewhat outdated
"The Isabelle Cookbook" in \<^url>\<open>https://nms.kcl.ac.uk/christian.urban/Cookbook/\<close>.
The present text is also complementary to the current version of
\<^url>\<open>https://isabelle.in.tum.de/dist/Isabelle2021/doc/isar-ref.pdf\<close>
\<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>
"The Isabelle/Isar Implementation" by Makarius Wenzel in that it focusses on subjects
not covered there, or presents alternative explanations for which I believe, based on my
experiences with students and Phds, that they are helpful.
@@ -364,7 +366,7 @@ subsection*[t213::example]\<open>Mechanism 2 : Extending the Global Context \<op
text\<open>A central mechanism for constructing user-defined data is by the \<^ML_functor>\<open>Generic_Data\<close>-functor.
A plugin needing some data \<^verbatim>\<open>T\<close> and providing it with implementations for an
\<^verbatim>\<open>empty\<close>, and operations \<^verbatim>\<open>merge\<close> and \<^verbatim>\<open>extend\<close>, can construct a lense with operations
\<^verbatim>\<open>empty\<close>, and operation \<^verbatim>\<open>merge\<close>, can construct a lense with operations
\<^verbatim>\<open>get\<close> and \<^verbatim>\<open>put\<close> that attach this data into the generic system context. Rather than using
unsynchronized SML mutable variables, this is the mechanism to introduce component local
data in Isabelle, which allows to manage this data for the necessary backtrack and synchronization
@@ -373,14 +375,12 @@ text\<open>A central mechanism for constructing user-defined data is by the \<^M
ML \<open>
datatype X = mt
val init = mt;
val ext = I
fun merge (X,Y) = mt
structure Data = Generic_Data
(
type T = X
val empty = init
val extend = ext
val merge = merge
);
\<close>

View File

@@ -151,19 +151,19 @@ fi
ACTUAL_ISABELLE_VERSION=`$ISABELLE_TOOL version`
ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL options -g dof_isabelle)"
ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL dof_param -b isabelle_version)"
if [ ${ISABELLE_VERSION} = "Isabelle" ];
then
echo "Error: cannot find Isabelle/DOF configuration, please check that you"
echo " registered Isabelle/DOF as an Isabelle component, e.g., using"
echo " isabelle components -u $PWD"
echo " isabelle components -u ."
exit 1
fi
AFP_DATE="$($ISABELLE_TOOL options -g dof_afp)"
AFP_DATE="$($ISABELLE_TOOL dof_param -b afp_version)"
AFP_URL="https://www.isa-afp.org/release/"$AFP_DATE".tar.gz"
echo ""

View File

@@ -125,14 +125,14 @@ fun heading_command (name, pos) descr level =
Monitor_Command_Parser.document_command (name, pos) descr
{markdown = false, body = true} (enriched_text_element_cmd level);
val _ = heading_command ("title*", @{here}) "section heading" NONE;
val _ = heading_command ("subtitle*", @{here}) "section heading" NONE;
val _ = heading_command ("chapter*", @{here}) "section heading" (SOME (SOME 0));
val _ = heading_command ("section*", @{here}) "section heading" (SOME (SOME 1));
val _ = heading_command ("subsection*", @{here}) "subsection heading" (SOME (SOME 2));
val _ = heading_command ("subsubsection*", @{here}) "subsubsection heading" (SOME (SOME 3));
val _ = heading_command ("paragraph*", @{here}) "paragraph heading" (SOME (SOME 4));
val _ = heading_command ("subparagraph*", @{here}) "subparagraph heading" (SOME (SOME 5));
val _ = heading_command \<^command_keyword>\<open>title*\<close> "section heading" NONE;
val _ = heading_command \<^command_keyword>\<open>subtitle*\<close> "section heading" NONE;
val _ = heading_command \<^command_keyword>\<open>chapter*\<close> "section heading" (SOME (SOME 0));
val _ = heading_command \<^command_keyword>\<open>section*\<close> "section heading" (SOME (SOME 1));
val _ = heading_command \<^command_keyword>\<open>subsection*\<close> "subsection heading" (SOME (SOME 2));
val _ = heading_command \<^command_keyword>\<open>subsubsection*\<close> "subsubsection heading" (SOME (SOME 3));
val _ = heading_command \<^command_keyword>\<open>paragraph*\<close> "paragraph heading" (SOME (SOME 4));
val _ = heading_command \<^command_keyword>\<open>subparagraph*\<close> "subparagraph heading" (SOME (SOME 5));
end
end
@@ -157,6 +157,9 @@ doc_class figure =
placement :: placement
spawn_columns :: bool <= True
doc_class figure2 = figure +
caption :: string
doc_class side_by_side_figure = figure +
anchor :: "string"
@@ -235,8 +238,8 @@ ML\<open>
(* Ontological Macro Command Support *)
(* *********************************************************************** *)
val _ = Onto_Macros.heading_command ("figure*", @{here}) "figure" NONE;
val _ = Onto_Macros.heading_command ("side_by_side_figure*", @{here}) "multiple figures" NONE;
val _ = Onto_Macros.heading_command \<^command_keyword>\<open>figure*\<close> "figure" NONE;
val _ = Onto_Macros.heading_command \<^command_keyword>\<open>side_by_side_figure*\<close> "multiple figures" NONE;
\<close>
(*<*)
@@ -251,6 +254,99 @@ setup\<open>\<close>
*)
(*>*)
subsubsection\<open>Figure Content\<close>
text\<open>The intermediate development goal is to separate the ontological, top-level construct
\<open>figure*\<close>, which will remain a referentiable, ontological document unit, from the more versatile
\<^emph>\<open>import\<close> of a figure. The hope is that this opens the way for more orthogonality and
abstraction from the LaTeX engine.
\<close>
ML\<open>
type fig_content = {relative_width : int, (* percent of textwidth, default 100 *)
scale : int, (* percent, default 100 *)
caption : Input.source (* default empty *)}
val mt_fig_content = {relative_width = 100,
scale = 100,
caption = Input.empty }: fig_content
(* dumb as 100 m of dirt road. *)
fun upd_relative_width key {relative_width,scale,caption } : fig_content =
{relative_width = key,scale = scale,caption = caption}: fig_content
fun upd_scale key {relative_width,scale,caption } : fig_content =
{relative_width = relative_width,scale = key,caption = caption}: fig_content
fun upd_caption key {relative_width,scale,caption} : fig_content =
{relative_width = relative_width,scale = scale,caption= key}: fig_content
val widthN = "width"
val scaleN = "scale"
val captionN = "caption";
fun fig_content_modes (ctxt, toks) =
let val (y, toks') = ((((Scan.optional
(Args.parens
(Parse.list1
( (Args.$$$ widthN |-- Args.$$$ "=" -- Parse.int
>> (fn (_, k) => upd_relative_width k))
|| (Args.$$$ scaleN |-- Args.$$$ "=" -- Parse.int
>> (fn (_, k) => upd_scale k))
|| (Args.$$$ captionN |-- Args.$$$ "=" -- Parse.document_source
>> (fn (_, k) => upd_caption k))
))) [K mt_fig_content])
: (fig_content -> fig_content) list parser)
>> (foldl1 (op #>)))
: (fig_content -> fig_content) parser)
(toks)
in (y, (ctxt, toks')) end
fun document_antiq (check: Proof.context -> Path.T option -> Input.source -> Path.T) =
Args.context -- Scan.lift Parse.path_input >> (fn (ctxt, source) =>
(check ctxt NONE source;
Latex.string (Latex.output_ascii_breakable "/" (Input.string_of source))
|> Latex.macro "isatt"));
fun fig_content_antiquotation name scan =
(Document_Output.antiquotation_raw_embedded name
(scan : ((fig_content -> fig_content) * Input.source) context_parser)
(fn ctxt =>
(fn (cfg_trans,file:Input.source) =>
let val {relative_width,scale,caption} = cfg_trans mt_fig_content
val _ = if relative_width < 0 orelse scale<0 then error("negative parameter.")
else ()
val wdth_s = if relative_width = 100 then ""
else "width="^Real.toString((Real.fromInt relative_width)
/ (Real.fromInt 100))^"\textwidth"
val scale_s= if scale = 100 then ""
else "scale="^Real.toString((Real.fromInt scale) / (Real.fromInt 100))
val arg = enclose "[" "]" (commas [wdth_s,scale_s])
val lab = Document_Output.output_document ctxt {markdown = false} caption
val path = Resources.check_file ctxt NONE file
val _ = writeln("file "^Path.file_name path)
(* ToDo: must be declared source of type png or jpeg or pdf, ... *)
in file
|> (Latex.string o Input.string_of)
|> (XML.enclose ("\\includegraphics"^arg^"{") "}")
|> (fn X => X @ Latex.macro "capture" lab)
end
)
));
val _ = fig_content_antiquotation
: binding
-> ((fig_content -> fig_content) * Input.source) context_parser
-> theory -> theory
val _ = Theory.setup
( fig_content_antiquotation \<^binding>\<open>fig_content\<close>
(fig_content_modes -- Scan.lift(Parse.path_input)))
\<close>
subsection\<open>Tables\<close>
(* TODO ! ! ! *)
(* dito the future monitor: table - block *)
@@ -260,7 +356,7 @@ text\<open>Tables are (sub) document-elements represented inside the documentati
language. The used technology is similar to the existing railroad-diagram support
(cf. \<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>, Sec. 4.5).
However, tables are not directly based on the ideosynchrasies of Knuth-based language design ---
However, tables are not directly based on the idiosyncrasies of Knuth-based language design ---
However, tables come with a more abstract structure model than conventional typesetting in the
LaTeX tradition. It is based of the following principles:
@@ -356,7 +452,7 @@ fun upd_cell_line_width num
cell_line_color = cell_line_color, cell_line_width = cell_line_width@[num] }
: cell_config
(*global default configs *)
val (tab_cell_placing, tab_cell_placing_setup)
= Attrib.config_string \<^binding>\<open>tab_cell_placing\<close> (K "center");
val (tab_cell_height, tab_cell_height_setup)
@@ -387,6 +483,8 @@ val _ = Theory.setup( tab_cell_placing_setup
#> tab_cell_line_width_setup
)
(*syntax for local tab specifier *)
val cell_placingN = "cell_placing"
val cell_heightN = "cell_height"
val cell_widthN = "cell_width"
@@ -396,7 +494,8 @@ val cell_line_widthN = "cell_line_width"
val placing_scan = Args.$$$ "left" || Args.$$$ "center" || Args.$$$ "right"
val color_scan = Args.$$$ "none" || Args.$$$ "red" || Args.$$$ "green" || Args.$$$ "blue" || Args.$$$ "black"
val color_scan = Args.$$$ "none" || Args.$$$ "red" || Args.$$$ "green"
|| Args.$$$ "blue" || Args.$$$ "black"
(*
@@ -516,7 +615,7 @@ val _ = Theory.setup
end
\<close>
text\<open> @{file "../ROOT"} \<close>
define_shortcut* clearpage \<rightleftharpoons> \<open>\clearpage{}\<close>
hf \<rightleftharpoons> \<open>\hfill\<close>
br \<rightleftharpoons> \<open>\break\<close>
@@ -526,6 +625,9 @@ declare[[tab_cell_placing="left",tab_cell_height="18.0cm"]]
section\<open>Tests\<close>
(*<*)
text\<open> @{fig_content [display] (scale = 80, width=80, caption=\<open>this is \<^term>\<open>\<sigma>\<^sub>i+2\<close> \<dots>\<close>)
\<open>../../examples/scholarly_paper/2018-cicm-isabelle_dof-applications/document/figures/isabelle-architecture.pdf\<close>}\<close>
text\<open> @{table_inline [display] (cell_placing = center,cell_height =\<open>12.0cm\<close>,
cell_height =\<open>13pt\<close>, cell_width = \<open>12.0cm\<close>,
cell_bgnd_color=black,cell_line_color=red,cell_line_width=\<open>12.0cm\<close>)



@@ -178,7 +178,7 @@ local open RegExpChecker in
type automaton = state * ((Int.int -> state -> state) * (state -> bool))
val add_atom = fold_aterms (fn Const(c as(_,Type(@{type_name "rexp"},_)))=> insert (op=) c |_=>I);
val add_atom = fold_aterms (fn Const (c as (_, \<^Type>\<open>rexp _\<close>)) => insert (op=) c | _=> I);
fun alphabet termS = rev(map fst (fold add_atom termS []));
fun ext_alphabet env termS =
let val res = rev(map fst (fold add_atom termS [])) @ env;
@@ -187,14 +187,14 @@ local open RegExpChecker in
else ()
in res end;
fun conv (Const(@{const_name "Regular_Exp.rexp.Zero"},_)) _ = Zero
|conv (Const(@{const_name "Regular_Exp.rexp.One"},_)) _ = Onea
|conv (Const(@{const_name "Regular_Exp.rexp.Times"},_) $ X $ Y) env = Times(conv X env, conv Y env)
|conv (Const(@{const_name "Regular_Exp.rexp.Plus"},_) $ X $ Y) env = Plus(conv X env, conv Y env)
|conv (Const(@{const_name "Regular_Exp.rexp.Star"},_) $ X) env = Star(conv X env)
|conv (Const(@{const_name "RegExpInterface.opt"},_) $ X) env = Plus(conv X env, Onea)
|conv (Const(@{const_name "RegExpInterface.rep1"},_) $ X) env = Times(conv X env, Star(conv X env))
|conv (Const (s, Type(@{type_name "rexp"},_))) env =
fun conv \<^Const_>\<open>Regular_Exp.rexp.Zero _\<close> _ = Zero
|conv \<^Const_>\<open>Regular_Exp.rexp.One _\<close> _ = Onea
|conv \<^Const_>\<open>Regular_Exp.rexp.Times _ for X Y\<close> env = Times(conv X env, conv Y env)
|conv \<^Const_>\<open>Regular_Exp.rexp.Plus _ for X Y\<close> env = Plus(conv X env, conv Y env)
|conv \<^Const_>\<open>Regular_Exp.rexp.Star _ for X\<close> env = Star(conv X env)
|conv \<^Const_>\<open>RegExpInterface.opt _ for X\<close> env = Plus(conv X env, Onea)
|conv \<^Const_>\<open>RegExpInterface.rep1 _ for X\<close> env = Times(conv X env, Star(conv X env))
|conv (Const (s, \<^Type>\<open>rexp _\<close>)) env =
let val n = find_index (fn x => x = s) env
val _ = if n<0 then error"conversion error of regexp." else ()
in Atom(n) end
@@ -203,7 +203,9 @@ local open RegExpChecker in
val eq_int = {equal = curry(op =) : Int.int -> Int.int -> bool};
val eq_bool_list = {equal = curry(op =) : bool list -> bool list -> bool};
fun rexp_term2da env term = let val rexp = conv term env;
fun rexp_term2da env term = let val _ = writeln ("In rexp_term2da term: " ^ @{make_string} term)
val rexp = conv term env;
val nda = RegExpChecker.rexp2na eq_int rexp;
val da = RegExpChecker.na2da eq_bool_list nda;
in da end;
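The ML pipeline above converts HOL `rexp` terms into a nondeterministic and then a deterministic automaton (`rexp2na`/`na2da`) to decide word acceptance. As a standalone illustration of the acceptance semantics being implemented, here is a minimal Scala sketch — hypothetical, using Brzozowski derivatives instead of the NA/DA construction of the actual `RegExpChecker` — over the same constructors (`Zero`, `One`/`Onea`, `Atom`, `Plus`, `Times`, `Star`):

```scala
// Hypothetical sketch (not the actual RegExpChecker): regular-expression
// acceptance via Brzozowski derivatives, letters encoded as Int indices
// as in the ML code's Atom(n).
sealed trait Rexp
case object Zero extends Rexp                   // empty language
case object One extends Rexp                    // empty word only
case class Atom(n: Int) extends Rexp            // single letter n
case class Plus(a: Rexp, b: Rexp) extends Rexp  // union
case class Times(a: Rexp, b: Rexp) extends Rexp // concatenation
case class Star(a: Rexp) extends Rexp           // iteration

object Rexp {
  // Does the language of r contain the empty word?
  def nullable(r: Rexp): Boolean = r match {
    case Zero => false
    case One => true
    case Atom(_) => false
    case Plus(a, b) => nullable(a) || nullable(b)
    case Times(a, b) => nullable(a) && nullable(b)
    case Star(_) => true
  }
  // Derivative of the language of r with respect to letter c.
  def deriv(c: Int, r: Rexp): Rexp = r match {
    case Zero | One => Zero
    case Atom(n) => if (n == c) One else Zero
    case Plus(a, b) => Plus(deriv(c, a), deriv(c, b))
    case Times(a, b) =>
      if (nullable(a)) Plus(Times(deriv(c, a), b), deriv(c, b))
      else Times(deriv(c, a), b)
    case Star(a) => Times(deriv(c, a), Star(a))
  }
  // w is accepted iff the iterated derivative is nullable.
  def accepts(r: Rexp, w: List[Int]): Boolean =
    nullable(w.foldLeft(r)((s, c) => deriv(c, s)))
}
```

For instance, `Rexp.accepts(Star(Plus(Atom(0), Atom(1))), List(0, 1, 0))` holds, mirroring how the generated DA is queried via `DA.accepts` elsewhere in this changeset.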


@@ -14,4 +14,6 @@ session "Isabelle_DOF" = "Functional-Automata" +
theories
"DOF/Isa_DOF"
"ontologies/ontologies"
export_classpath


@@ -19,8 +19,6 @@
%% you need to download lipics.cls from
%% https://www.dagstuhl.de/en/publications/lipics/instructions-for-authors/
%% and add it manually to the praemble.tex and the ROOT file.
%% Moreover, the option "document_comment_latex=true" needs to be set
%% in the ROOT file.
%%
%% All customization and/or additional packages should be added to the file
%% preamble.tex.


@@ -28,6 +28,8 @@ theory CENELEC_50128
imports "Isabelle_DOF.technical_report"
begin
define_ontology "DOF-CENELEC_50128.sty"
(* this is a hack and should go into an own ontology, providing thingsd like:
- Assumption*
- Hypothesis*
@@ -1009,14 +1011,15 @@ fun check_sil oid _ ctxt =
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
val monitor_record_value = #value (the (DOF_core.get_object_local oid ctxt'))
val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
fun check_sil'' [] = true
| check_sil'' (x::xs) =
let
val (_, doc_oid) = x
val doc_record_value = #value (the (DOF_core.get_object_local doc_oid ctxt'))
val Const _ $ _ $ _ $ _ $ _ $ cenelec_document_ext = doc_record_value
val Const _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = cenelec_document_ext
(*val Const _ $ _ $ _ $ _ $ _ $ cenelec_document_ext = doc_record_value
val Const _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = cenelec_document_ext*)
val Const _ $ _ $ _ $ _ $ _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = doc_record_value
in
if doc_sil <> monitor_sil
then error(doc_oid
@@ -1041,7 +1044,7 @@ fun check_sil_slow oid _ ctxt =
val monitor_sil_typ = (Syntax.read_typ ctxt' monitor_cid) --> @{typ "sil"}
val monitor_sil = Value_Command.value ctxt'
(Const("CENELEC_50128.monitor_SIL.sil", monitor_sil_typ) $ monitor_record_value)
val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
fun check_sil' [] = true
| check_sil' (x::xs) =
let
@@ -1072,7 +1075,7 @@ fun check_required_documents oid _ ctxt =
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
val {monitor_tab,...} = DOF_core.get_data ctxt'
val {accepted_cids, ...} = the (Symtab.lookup monitor_tab oid)
val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
fun check_required_documents' [] = true
| check_required_documents' (cid::cids) =
if exists (fn (doc_cid, _) => equal cid doc_cid) traces
@@ -1104,8 +1107,9 @@ val monitor_record_value =
val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
val doc_record_value = #value (the (DOF_core.get_object_global
"CenelecClassPatternMatchingTest" thy))
val Const _ $ _ $ _ $ _ $ _ $ cenelec_document_ext = doc_record_value
val Const _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = cenelec_document_ext
(*val Const _ $ _ $ _ $ _ $ _ $ cenelec_document_ext = doc_record_value
val Const _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = cenelec_document_ext*)
val Const _ $ _ $ _ $ _ $ _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = doc_record_value
\<close>
section\<open> Software Assurance \<close>
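The `check_sil` invariant above walks the monitor's trace and compares the SIL recorded in each referenced document instance against the monitor's own SIL, raising an error on the first mismatch. A minimal Scala sketch of that consistency check — hypothetical names, with the trace as `(class-id, object-id)` pairs as in `compute_trace_ML`:

```scala
// Hypothetical sketch of the SIL-consistency invariant from check_sil:
// every document referenced in the monitor trace must carry the same
// safety integrity level (SIL) as the monitor itself.
sealed trait SIL
case object SIL0 extends SIL
case object SIL1 extends SIL
case object SIL2 extends SIL

def checkSil(monitorSil: SIL,
             trace: List[(String, String)],   // (class id, object id)
             silOf: String => SIL): Boolean = // SIL stored per object id
  trace.forall { case (_, docOid) =>
    val docSil = silOf(docOid)
    if (docSil != monitorSil)
      sys.error(docOid + " has SIL " + docSil +
        " but the monitor requires " + monitorSil)
    else true
  }
```

A `Map[String, SIL]` can serve directly as the `silOf` lookup, standing in for the record-destructuring (`doc_record_value`) done in the ML code.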


@@ -28,7 +28,7 @@ doc_class A =
subsection\<open>Excursion: On the semantic consequences of this definition: \<close>
text\<open>This class definition leads an implicit Isabelle/HOL \<^theory_text>\<open>record\<close> definition
(cf. \<^url>\<open>https://isabelle.in.tum.de/dist/Isabelle2021/doc/isar-ref.pdf\<close>, chapter 11.6.).
(cf. \<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>, chapter 11.6.).
Consequently, \<^theory_text>\<open>doc_class\<close>'es inherit the entire theory-infrastructure from Isabelle records:
\<^enum> there is a HOL-type \<^typ>\<open>A\<close> and its extensible version \<^typ>\<open>'a A_scheme\<close>
\<^enum> there are HOL-terms representing \<^emph>\<open>doc_class instances\<close> with the high-level syntax:
@@ -88,7 +88,7 @@ doc_class B =
text\<open>We may even use type-synonyms for class synonyms ...\<close>
type_synonym XX = B
print_doc_items
subsection\<open>Examples of inheritance \<close>
doc_class C = XX +


@@ -17,9 +17,11 @@ theory scholarly_paper
imports "Isabelle_DOF.Isa_COL"
keywords "author*" "abstract*"
"Definition*" "Lemma*" "Theorem*" :: document_body
begin
define_ontology "DOF-scholarly_paper.sty"
define_ontology "DOF-scholarly_paper-thm.sty"
text\<open>Scholarly Paper provides a number of standard text - elements for scientific papers.
They were introduced in the following.\<close>
@@ -46,12 +48,12 @@ doc_class abstract =
ML\<open>
val _ =
Monitor_Command_Parser.document_command ("abstract*", @{here}) "Textual Definition"
Monitor_Command_Parser.document_command \<^command_keyword>\<open>abstract*\<close> "Textual Definition"
{markdown = true, body = true}
(Onto_Macros.enriched_document_cmd_exp (SOME "abstract") []);
val _ =
Monitor_Command_Parser.document_command ("author*", @{here}) "Textual Definition"
Monitor_Command_Parser.document_command \<^command_keyword>\<open>author*\<close> "Textual Definition"
{markdown = true, body = true}
(Onto_Macros.enriched_document_cmd_exp (SOME "author") []);
\<close>
@@ -289,7 +291,7 @@ setup\<open>Theorem_default_class_setup\<close>
ML\<open> local open ODL_Meta_Args_Parser in
val _ =
Monitor_Command_Parser.document_command ("Definition*", @{here}) "Textual Definition"
Monitor_Command_Parser.document_command \<^command_keyword>\<open>Definition*\<close> "Textual Definition"
{markdown = true, body = true}
(fn meta_args => fn thy =>
let
@@ -301,7 +303,7 @@ val _ =
end);
val _ =
Monitor_Command_Parser.document_command ("Lemma*", @{here}) "Textual Lemma Outline"
Monitor_Command_Parser.document_command \<^command_keyword>\<open>Lemma*\<close> "Textual Lemma Outline"
{markdown = true, body = true}
(fn meta_args => fn thy =>
let
@@ -313,7 +315,7 @@ val _ =
end);
val _ =
Monitor_Command_Parser.document_command ("Theorem*", @{here}) "Textual Theorem Outline"
Monitor_Command_Parser.document_command \<^command_keyword>\<open>Theorem*\<close> "Textual Theorem Outline"
{markdown = true, body = true}
(fn meta_args => fn thy =>
let
@@ -453,10 +455,10 @@ fun dest_option _ (Const (@{const_name "None"}, _)) = NONE
in
fun check ctxt cidS mon_id pos =
let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos @{here}
fun check ctxt cidS mon_id pos_opt =
let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos_opt @{here}
val groups = partition (Context.proof_of ctxt) cidS trace
fun get_level_raw oid = AttributeAccess.compute_attr_access ctxt "level" oid @{here} @{here};
fun get_level_raw oid = ISA_core.compute_attr_access ctxt "level" oid NONE @{here};
fun get_level oid = dest_option (snd o HOLogic.dest_number) (get_level_raw (oid));
fun check_level_hd a = case (get_level (snd a)) of
NONE => error("Invariant violation: leading section" ^ snd a ^
@@ -484,7 +486,7 @@ end
setup\<open> let val cidS = ["scholarly_paper.introduction","scholarly_paper.technical",
"scholarly_paper.example", "scholarly_paper.conclusion"];
fun body moni_oid _ ctxt = (Scholarly_paper_trace_invariant.check
ctxt cidS moni_oid @{here};
ctxt cidS moni_oid NONE;
true)
in DOF_core.update_class_invariant "scholarly_paper.article" body end\<close>


@@ -65,8 +65,8 @@ doc_class result = technical +
ML\<open>fun check_invariant_invariant oid {is_monitor:bool} ctxt =
let val kind_term = AttributeAccess.compute_attr_access ctxt "kind" oid @{here} @{here}
val property_termS = AttributeAccess.compute_attr_access ctxt "property" oid @{here} @{here}
let val kind_term = ISA_core.compute_attr_access ctxt "kind" oid NONE @{here}
val property_termS = ISA_core.compute_attr_access ctxt "property" oid NONE @{here}
val tS = HOLogic.dest_list property_termS
in case kind_term of
@{term "proof"} => if not(null tS) then true
@@ -137,10 +137,10 @@ fun dest_option _ (Const (@{const_name "None"}, _)) = NONE
in
fun check ctxt cidS mon_id pos =
let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos @{here}
fun check ctxt cidS mon_id pos_opt =
let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos_opt @{here}
val groups = partition (Context.proof_of ctxt) cidS trace
fun get_level_raw oid = AttributeAccess.compute_attr_access ctxt "level" oid @{here} @{here};
fun get_level_raw oid = ISA_core.compute_attr_access ctxt "level" oid NONE @{here};
fun get_level oid = dest_option (snd o HOLogic.dest_number) (get_level_raw (oid));
fun check_level_hd a = case (get_level (snd a)) of
NONE => error("Invariant violation: leading section" ^ snd a ^
@@ -165,7 +165,7 @@ end
\<close>
setup\<open> let val cidS = ["small_math.introduction","small_math.technical", "small_math.conclusion"];
fun body moni_oid _ ctxt = (Small_Math_trace_invariant.check ctxt cidS moni_oid @{here};
fun body moni_oid _ ctxt = (Small_Math_trace_invariant.check ctxt cidS moni_oid NONE;
true)
in DOF_core.update_class_invariant "small_math.article" body end\<close>


@@ -17,6 +17,8 @@ theory technical_report
imports "Isabelle_DOF.scholarly_paper"
begin
define_ontology "DOF-technical_report.sty"
(* for reports paper: invariant: level \<ge> -1 *)
section\<open>More Global Text Elements for Reports\<close>

src/scala/dof.scala Normal file

@@ -0,0 +1,120 @@
/*
* Copyright (c)
* 2021-2022 The University of Exeter.
* 2021-2022 The University of Paris-Saclay.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in
* the documentation and/or other materials provided with the
* distribution.
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
* COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
* BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
* ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGE.
*
* SPDX-License-Identifier: BSD-2-Clause
*/
/*** constants and parameters for Isabelle/DOF ***/
package isabelle.dof
import isabelle._
object DOF {
/** parameters **/
val isabelle_version = "2022"
val isabelle_url = "https://isabelle.in.tum.de/website-Isabelle2022"
val afp_version = "afp-2022-10-27"
// Isabelle/DOF version: "Unreleased" for development, semantic version for releases
val version = "Unreleased"
val session = "Isabelle_DOF"
val latest_version = "1.3.0"
val latest_isabelle = "Isabelle2021-1"
val latest_doi = "10.5281/zenodo.6810799"
val generic_doi = "10.5281/zenodo.3370482"
// Isabelle/DOF source repository
val url = "https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF"
// Isabelle/DOF release artifacts
val artifact_dir = "releases/Isabelle_DOF/Isabelle_DOF"
val artifact_host = "artifacts.logicalhacking.com"
val artifact_url: String = "https://" + artifact_host + "/" + artifact_dir
def options(opts: Options): Options = opts + "document_comment_latex"
/** Isabelle tool wrapper **/
sealed case class Parameter(name: String, value: String) {
override def toString: String = name
def print(value_only: Boolean): String =
if (value_only) value else name + "=" + value
}
val parameters: List[Parameter] =
List(
Parameter("isabelle_version", isabelle_version),
Parameter("afp_version", afp_version),
Parameter("dof_version", version)
).sortBy(_.name)
def print_parameters(names: List[String],
all: Boolean = false,
value_only: Boolean = false,
progress: Progress = new Progress
): Unit = {
val bad = names.filter(name => !parameters.exists(_.name == name))
if (bad.nonEmpty) error("Unknown parameter(s): " + commas_quote(bad))
val params = if (all) parameters else parameters.filter(p => names.contains(p.name))
for (p <- params) progress.echo(p.print(value_only))
}
val isabelle_tool = Isabelle_Tool("dof_param", "print Isabelle/DOF parameters",
Scala_Project.here, args =>
{
var all = false
var value_only = false
val getopts = Getopts("""
Usage: isabelle dof_param [OPTIONS] NAMES
Options are:
-a print all parameters
-b print values only (default: NAME=VALUE)
Print given Isabelle/DOF parameters, with names from the list:
""" + commas_quote(parameters.map(_.toString)),
"a" -> (_ => all = true),
"b" -> (_ => value_only = true))
val names = getopts(args)
if (names.isEmpty && !all) getopts.usage()
print_parameters(names, all = all, value_only = value_only, progress = new Console_Progress)
})
}
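The `dof_param` tool above rejects unknown parameter names and prints the known ones either as `NAME=VALUE` or value-only (`-b`). A self-contained Scala sketch of that printing logic, with parameter values mirroring the constants in `dof.scala` (the helper names `printParameters`/`valueOnly` are illustrative, not the module's API):

```scala
// Sketch of the parameter printing behind "isabelle dof_param":
// unknown names are rejected, known ones are printed sorted by name.
case class Parameter(name: String, value: String) {
  def print(valueOnly: Boolean): String =
    if (valueOnly) value else name + "=" + value
}

val parameters: List[Parameter] = List(
  Parameter("isabelle_version", "2022"),
  Parameter("afp_version", "afp-2022-10-27"),
  Parameter("dof_version", "Unreleased")
).sortBy(_.name)

def printParameters(names: List[String],
                    all: Boolean = false,
                    valueOnly: Boolean = false): List[String] = {
  val bad = names.filterNot(n => parameters.exists(_.name == n))
  if (bad.nonEmpty) sys.error("Unknown parameter(s): " + bad.mkString(", "))
  val params = if (all) parameters else parameters.filter(p => names.contains(p.name))
  params.map(_.print(valueOnly))
}
```

So `printParameters(List("dof_version"))` yields the single line `dof_version=Unreleased`, matching the tool's default `NAME=VALUE` output.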


@@ -1,7 +1,7 @@
/*
* Copyright (c)
* 2021-2022 The University of Exeter.
* 2021-2022 The University of Paris-Saclay.
* Copyright (c)
* 2021-2022 The University of Exeter.
* 2021-2022 The University of Paris-Saclay.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
@@ -28,6 +28,8 @@
* SPDX-License-Identifier: BSD-2-Clause
*/
/*** document build engine for Isabelle/DOF ***/
package isabelle.dof
import isabelle._
@@ -37,61 +39,61 @@ object DOF_Document_Build
{
class Engine extends Document_Build.Bash_Engine("dof")
{
// override def use_build_script: Boolean = true
def the_document_entry(context: Document_Build.Context, name: String): Export.Entry = {
val entries =
for {
node_name <- context.document_theories
entry <- context.session_context.get(node_name.theory, name)
} yield entry
entries match {
case List(entry) => entry
case Nil =>
error("Missing export " + quote(name) + " for document theories of session " +
quote(context.session))
case dups =>
error("Multiple exports " + quote(name) + " for theories " +
commas_quote(dups.map(_.theory_name).sorted.distinct))
}
}
override def prepare_directory(
context: Document_Build.Context,
dir: Path,
doc: Document_Build.Document_Variant): Document_Build.Directory =
{
val regex = """^.*\.""".r
val latex_output = new Latex_Output(context.options)
val options = DOF.options(context.options)
val latex_output = new Latex_Output(options)
val directory = context.prepare_directory(dir, doc, latex_output)
// produced by alternative presentation hook (workaround for missing Toplevel.present_theory)
for {
name <- context.document_theories.iterator
entry <- context.session_context.get(name.theory, Export.DOCUMENT_LATEX + "_dof")
} {
val path = Path.basic(Document_Build.tex_name(name))
val xml = YXML.parse_body(entry.text)
File.content(path, xml).output(latex_output(_, file_pos = path.implode_symbolic))
.write(directory.doc_dir)
}
val isabelle_dof_dir = context.session_context.sessions_structure("Isabelle_DOF").dir
// print(context.options.string("dof_url"));
// copy Isabelle/DOF LaTeX templates
val template_dir = isabelle_dof_dir + Path.explode("document-templates")
// TODO: error handling in case 1) template does not exist or 2) root.tex does already exist
val template = regex.replaceAllIn(context.options.string("dof_template"),"")
Isabelle_System.copy_file(template_dir + Path.explode("root-"+template+".tex"),
directory.doc_dir+Path.explode("root.tex"))
val isabelle_dof_dir = context.session_context.sessions_structure(DOF.session).dir
// copy Isabelle/DOF LaTeX styles
// LaTeX styles from Isabelle/DOF directory
List(Path.explode("DOF/latex"), Path.explode("ontologies"))
.flatMap(dir =>
File.find_files((isabelle_dof_dir + dir).file,
file => file.getName.endsWith(".sty"), include_dirs = true))
.flatMap(dir => File.find_files((isabelle_dof_dir + dir).file, _.getName.endsWith(".sty")))
.foreach(sty => Isabelle_System.copy_file(sty, directory.doc_dir.file))
// create ontology.sty
val ltx_styles = context.options.string("dof_ontologies").split(" +").map(s => regex.replaceAllIn(s,""))
File.write(directory.doc_dir+Path.explode("ontologies.tex"),
ltx_styles.mkString("\\usepackage{DOF-","}\n\\usepackage{DOF-","}\n"))
// ontologies.tex from session exports
File.write(directory.doc_dir + Path.explode("ontologies.tex"),
split_lines(the_document_entry(context, "dof/use_ontology").text)
.map(name => "\\usepackage{DOF-" + name + "}\n").mkString)
// create dof-config.sty
File.write(directory.doc_dir+Path.explode("dof-config.sty"), """
\newcommand{\isabelleurl}{https://isabelle.in.tum.de/website-Isabelle2022/""" + context.options.string("dof_isabelle") + """}
\newcommand{\dofurl}{""" + context.options.string("dof_url") + """}
\newcommand{\dof@isabelleversion}{""" + context.options.string("dof_isabelle") + """}
\newcommand{\isabellefullversion}{""" + context.options.string("dof_isabelle") + """\xspace}
\newcommand{\dof@version}{""" + context.options.string("dof_version") + """}
\newcommand{\dof@artifacturl}{""" + context.options.string("dof_artifact_dir") + """}
\newcommand{\doflatestversion}{""" + context.options.string("dof_latest_version") + """}
\newcommand{\isadoflatestdoi}{""" + context.options.string("dof_latest_doi") + """}
\newcommand{\isadofgenericdoi}{""" + context.options.string("dof_generic_doi") + """}
\newcommand{\isabellelatestversion}{""" + context.options.string("dof_latest_isabelle") + """}
// root.tex from session exports
File.write(directory.doc_dir + Path.explode("root.tex"),
the_document_entry(context, "dof/use_template").text)
// dof-config.sty
File.write(directory.doc_dir + Path.explode("dof-config.sty"), """
\newcommand{\isabelleurl}{""" + DOF.isabelle_url + """}
\newcommand{\dofurl}{""" + DOF.url + """}
\newcommand{\dof@isabelleversion}{""" + DOF.isabelle_version + """}
\newcommand{\isabellefullversion}{""" + DOF.isabelle_version + """\xspace}
\newcommand{\dof@version}{""" + DOF.version + """}
\newcommand{\dof@artifacturl}{""" + DOF.artifact_url + """}
\newcommand{\doflatestversion}{""" + DOF.latest_version + """}
\newcommand{\isadoflatestdoi}{""" + DOF.latest_doi + """}
\newcommand{\isadofgenericdoi}{""" + DOF.generic_doi + """}
\newcommand{\isabellelatestversion}{""" + DOF.latest_isabelle + """}
""")
directory
}
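The new `the_document_entry` helper above distinguishes exactly three cases when resolving a session export: exactly one entry (use it), none (missing export), or several (ambiguous across theories). A reduced Scala sketch of that disambiguation pattern, with entries simplified to `(theory-name, text)` pairs (hypothetical types, not the real `Export.Entry`):

```scala
// Sketch of the_document_entry's three-way case analysis: an export name
// must resolve to exactly one entry among the document theories.
def theDocumentEntry(entries: List[(String, String)], name: String): String =
  entries match {
    case List((_, text)) => text
    case Nil =>
      sys.error("Missing export " + name + " for document theories")
    case dups =>
      sys.error("Multiple exports " + name + " for theories " +
        dups.map(_._1).sorted.distinct.mkString(", "))
  }
```

This is the mechanism by which `dof/use_template` and `dof/use_ontology` exports from the theories drive the generation of `root.tex` and `ontologies.tex` in the code above.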


@@ -1,7 +1,7 @@
/*
* Copyright (c)
* 2021-2022 The University of Exeter.
* 2021-2022 The University of Paris-Saclay.
* Copyright (c)
* 2021-2022 The University of Exeter.
* 2021-2022 The University of Paris-Saclay.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
@@ -29,32 +29,32 @@
*/
/* Author: Makarius
Prepare session root directory for Isabelle/DOF.
*/
/*** create session root directory for Isabelle/DOF ***/
package isabelle.dof
import isabelle._
object DOF_Mkroot
{
/** mkroot **/
val default_ontologies: List[String] = List("technical_report", "scholarly_paper")
val default_template = "scrreprt-modern"
def mkroot(
session_name: String = "",
session_dir: Path = Path.current,
session_parent: String = "",
init_repos: Boolean = false,
template: String = "",
ontologies: List[String] = List(),
ontologies: List[String] = default_ontologies,
template: String = default_template,
quiet: Boolean = false,
progress: Progress = new Progress): Unit =
{
Isabelle_System.make_directory(session_dir)
val name = proper_string(session_name) getOrElse session_dir.absolute_file.getName
val parent = proper_string(session_parent) getOrElse Isabelle_System.getenv("ISABELLE_LOGIC")
val root_path = session_dir + Sessions.ROOT
if (root_path.file.exists) error("Cannot overwrite existing " + root_path)
@@ -62,47 +62,54 @@ object DOF_Mkroot
val document_path = session_dir + Path.explode("document")
if (document_path.file.exists) error("Cannot overwrite existing " + document_path)
progress.echo("\nPreparing session " + quote(name) + " in " + session_dir)
progress.echo(
(if (quiet) "" else "\n") +
"Initializing session " + quote(name) + " in " + session_dir.absolute)
/* ROOT */
progress.echo(" creating " + root_path)
progress.echo_if(!quiet, " creating " + root_path)
File.write(root_path,
"session " + Mkroot.root_name(name) + " = " + Mkroot.root_name(parent) + """ +
options [document = pdf, document_output = "output", document_build = dof, dof_ontologies = """"
+ ontologies.mkString(" ") + """", dof_template = """ + Mkroot.root_name(template)
+ """, document_comment_latex=true]
"session " + Mkroot.root_name(name) + " = " + Mkroot.root_name(DOF.session) + """ +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
""" + Mkroot.root_name(name) + """
""" + Mkroot.root_name(name) + """
document_files
"preamble.tex"
""")
val thy = session_dir + Path.explode(name+".thy")
progress.echo(" creating " + thy)
val thy = session_dir + Path.explode(name + ".thy")
progress.echo_if(!quiet, " creating " + thy)
File.write(thy,
"theory\n " + name + "\nimports\n " + ontologies.mkString("\n ") + """
"theory\n " + name +
"\nimports\n " + ontologies.map("Isabelle_DOF." + _).mkString("\n ") + """
begin
use_template """ + quote(template) + """
use_ontology """ + ontologies.map(quote).mkString(" and ") + """
end
""")
/* preamble */
val preamble_tex = session_dir + Path.explode("document/preamble.tex")
progress.echo(" creating " + preamble_tex)
progress.echo_if(!quiet, " creating " + preamble_tex)
Isabelle_System.make_directory(preamble_tex.dir)
File.write(preamble_tex,"""%% This is a placeholder for user-specific configuration and packages.""")
/* Mercurial repository */
if (init_repos) {
progress.echo(" \nInitializing Mercurial repository " + session_dir)
progress.echo(
(if (quiet) "" else "\n") + "Initializing Mercurial repository " + session_dir.absolute)
val hg = Mercurial.init_repository(session_dir)
@@ -130,7 +137,7 @@ syntax: regexp
{
val print_dir = session_dir.implode
progress.echo("""
progress.echo_if(!quiet, """
Now use the following command line to build the session:
isabelle build -D """ +
@@ -139,42 +146,41 @@ Now use the following command line to build the session:
}
/** Isabelle tool wrapper **/
val isabelle_tool = Isabelle_Tool("dof_mkroot", "prepare session root directory for Isabelle/DOF",
val isabelle_tool = Isabelle_Tool("dof_mkroot", "create session root directory for Isabelle/DOF",
Scala_Project.here, args =>
{
var init_repos = false
var help = false
var session_name = ""
var session_parent = "Isabelle_DOF"
var ontologies:List[String] = List()
var template = session_parent + ".scrartcl"
val default_ontologies = List(session_parent+".scholarly_paper")
var ontologies = default_ontologies
var template = default_template
var quiet = false
val getopts = Getopts("""
Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]
Options are:
-I init Mercurial repository and add generated files
-h print help
-n NAME alternative session name (default: directory base name)
-o ONTOLOGY ontology (default: scholarly_paper)
-t TEMPLATE tempalte (default: scrartcl)
-o NAMES list of ontologies, separated by blanks
(default: """ + quote(Word.implode(default_ontologies)) + """)
-q quiet mode: less verbosity
-t NAME template (default: """ + quote(default_template) + """)
Prepare session root directory (default: current directory).
Create session root directory for Isabelle/DOF (default: current directory).
""",
"I" -> (arg => init_repos = true),
"I" -> (_ => init_repos = true),
"h" -> (_ => help = true),
"n:" -> (arg => session_name = arg),
"o:" -> (arg => ontologies = ontologies :+ arg),
"t:" -> (arg => template = arg),
"h" -> (arg => help = true)
)
"o:" -> (arg => ontologies = Word.explode(arg)),
"q" -> (_ => quiet = true),
"t:" -> (arg => template = arg))
val more_args = getopts(args)
ontologies = if (ontologies.isEmpty) default_ontologies else ontologies
if (help) {getopts.usage()} else {()}
val session_dir =
more_args match {
case Nil => Path.current
@@ -182,7 +188,12 @@ Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]
case _ => getopts.usage()
}
mkroot(session_parent=session_parent, session_name = session_name, session_dir = session_dir, init_repos = init_repos,
ontologies = ontologies, template = template, progress = new Console_Progress)
if (help) getopts.usage()
val progress = new Console_Progress
mkroot(session_name = session_name, session_dir = session_dir,
init_repos = init_repos, quiet = quiet, progress = progress,
ontologies = ontologies, template = template)
})
}


@@ -28,6 +28,8 @@
* SPDX-License-Identifier: BSD-2-Clause
*/
/*** command-line tools for Isabelle/DOF ***/
package isabelle.dof
@@ -35,5 +37,6 @@ import isabelle._
class DOF_Tools extends Isabelle_Scala_Tools(
DOF_Mkroot.isabelle_tool
DOF.isabelle_tool,
DOF_Mkroot.isabelle_tool,
)


@@ -37,13 +37,13 @@ ML\<open>
find_theorems (60) name:"Conceptual.M."
value [simp]"trace(M.make undefined [] ())"
value "ok(M.make undefined_AAA [] ())"
value "trace(M.make undefined_AAA [] ())"
value "tag_attribute(M.make undefined_AAA [] ())"
value [simp]"M.trace(M.make undefined [] ())"
value "M.ok(M.make undefined_AAA [] ())"
value "M.trace(M.make undefined_AAA [] ())"
value "M.tag_attribute(M.make undefined_AAA [] ())"
value "ok(M.make 0 [] ())"
value "M.ok(M.make 0 [] ())"
(*
value "ok(M.make undefined [] ())"
value "ok(M.make 0 [] undefined)"
@@ -168,7 +168,7 @@ update_instance*[omega::E, y+="[''en'']"]
ML\<open> val s = map HOLogic.dest_string (HOLogic.dest_list @{docitem_attribute y::omega}); \<close>
subsection\<open> Example text antiquotation:\<close>
text\<open> @{docitem_attribute omega::y} \<close>
text\<open> @{docitem_attribute y::omega} \<close>
section\<open>Simulation of a Monitor\<close>
@@ -193,6 +193,23 @@ ML \<open>@{trace_attribute figs1}\<close>
text\<open>Resulting trace of figs as text antiquotation:\<close>
text\<open>@{trace_attribute figs1}\<close>
text\<open>Test trace_attribute term antiquotation:\<close>
term*\<open>map snd @{trace-attribute \<open>figs1\<close>}\<close>
value*\<open>map snd @{trace-attribute \<open>figs1\<close>}\<close>
term*\<open>map fst @{trace-attribute \<open>aaa\<close>}\<close>
value*\<open>map fst @{trace-attribute \<open>aaa\<close>}\<close>
value*\<open>(map fst @{trace-attribute \<open>aaa\<close>}) \<close>
definition example_expression where "example_expression \<equiv> \<lbrace>\<lfloor>''Conceptual.A''\<rfloor> || \<lfloor>''Conceptual.F''\<rfloor>\<rbrace>\<^sup>*"
value* \<open> DA.accepts (na2da (rexp2na example_expression)) (map fst @{trace-attribute \<open>aaa\<close>}) \<close>
definition word_test :: "'a list \<Rightarrow> 'a rexp \<Rightarrow> bool" (infix "is-in" 60)
where " w is-in rexp \<equiv> DA.accepts (na2da (rexp2na rexp)) (w)"
value* \<open> (map fst @{trace-attribute \<open>aaa\<close>}) is-in example_expression \<close>
(*<*)
text\<open>Final Status:\<close>
print_doc_items


@@ -44,7 +44,7 @@ subsection*[b::A, x = "5"] \<open> Lorem ipsum dolor sit amet, ... \<close>
text\<open>Setting a sample invariant, referring to attribute value "x":\<close>
ML\<open>fun check_A_invariant oid {is_monitor:bool} ctxt =
let val term = AttributeAccess.compute_attr_access ctxt "x" oid @{here} @{here}
let val term = ISA_core.compute_attr_access ctxt "x" oid NONE @{here}
val (@{typ "int"},x_value) = HOLogic.dest_number term
in if x_value > 5 then error("class A invariant violation") else true end
\<close>
@@ -80,8 +80,11 @@ to take sub-classing into account:
\<close>
ML\<open>fun check_M_invariant oid {is_monitor} ctxt =
let val term = AttributeAccess.compute_attr_access ctxt "trace" oid @{here} @{here}
fun conv (Const(@{const_name "Pair"},_) $ Const(s,_) $ S) = (s, HOLogic.dest_string S)
let val term = ISA_core.compute_attr_access ctxt "trace" oid NONE @{here}
fun conv (\<^Const>\<open>Pair \<^typ>\<open>doc_class rexp\<close> \<^typ>\<open>string\<close>\<close>
$ (\<^Const>\<open>Atom \<^typ>\<open>doc_class\<close>\<close> $ (\<^Const>\<open>mk\<close> $ s)) $ S) =
let val s' = DOF_core.read_cid (Context.proof_of ctxt) (HOLogic.dest_string s)
in (s', HOLogic.dest_string S) end
val string_pair_list = map conv (HOLogic.dest_list term)
val cid_list = map fst string_pair_list
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
@@ -113,10 +116,15 @@ subsection*[f::E] \<open> Lectus accumsan velit ultrices, ... }\<
section*[f2::E] \<open> Lectus accumsan velit ultrices, ... }\<close>
*)
ML\<open>val term = AttributeAccess.compute_attr_access
(Context.Proof @{context}) "trace" "struct" @{here} @{here} ;
ML\<open>val ctxt = @{context}
val term = ISA_core.compute_attr_access
(Context.Proof ctxt) "trace" "struct" NONE @{here} ;
fun conv (Const(@{const_name "Pair"},_) $ Const(s,_) $ S) = (s, HOLogic.dest_string S)
val string_pair_list = map conv (HOLogic.dest_list term)
fun conv' (\<^Const>\<open>Pair \<^typ>\<open>doc_class rexp\<close> \<^typ>\<open>string\<close>\<close>
$ (\<^Const>\<open>Atom \<^typ>\<open>doc_class\<close>\<close> $ (\<^Const>\<open>mk\<close> $ s)) $ S) =
let val s' = DOF_core.read_cid ctxt (HOLogic.dest_string s)
in (s', HOLogic.dest_string S) end
val string_pair_list = map conv' (HOLogic.dest_list term)
\<close>
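The monitor code above converts the recorded trace into a list of pairs and then projects out the first component (`map fst string_pair_list`). As a hedged, language-neutral illustration in Python (names invented): the trace is just a list of (class-id, instance-id) pairs, and the monitor invariant only inspects the class-id column:

```python
def trace_to_cid_list(trace):
    """Project the class-identifier column out of a monitor trace,
    mirroring `map fst string_pair_list` in the ML code (sketch)."""
    return [cid for cid, _oid in trace]

trace = [("A", "a"), ("C", "c"), ("D", "d")]
cids = trace_to_cid_list(trace)  # ["A", "C", "D"]
```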


@ -11,16 +11,70 @@ text\<open>
To enable the checking of the invariants, the \<open>invariants_checking\<close>
theory attribute must be set:\<close>
declare[[invariants_checking = true]]
text\<open>For example, let's define the following two classes:\<close>
ML\<open>
val update_name = "High_Level_Syntax_Invariants.class_inv1.int1_update"
val term' = Const ("Groups.one_class.one", \<^typ>\<open>int\<close>)
val update_bname = Long_Name.base_name update_name
val attr_name = String.extract (update_bname, 0, SOME (String.size(update_bname) - String.size(Record.updateN)))
fun update_class_term_list attr_name term attr =
let
val (attr_id, ty, term_list) = attr
in
if (Binding.name_of attr_id = attr_name)
then
let
fun update_term_list [] term = [term]
| update_term_list (x::xs) term = x::term::xs
(*| update_term_list (x::y::xs) term = x::(update_term_list xs term)@[y]*)
in
(attr_id, ty, update_term_list term_list term) end
else attr
end
val class_list = [("High_Level_Syntax_Invariants.class_inv1",
[(Binding.name "int1", "int",
[Free ("High_Level_Syntax_Invariants_class_inv2_int1_Attribute_Not_Initialized", \<^typ>\<open>int\<close>),
Const ("Num.numeral_class.numeral", \<^typ>\<open>num \<Rightarrow> int\<close>)
$ (Const ("Num.num.Bit1", \<^typ>\<open>num \<Rightarrow> num\<close>)
$ Const ("Num.num.One", \<^typ>\<open>num\<close>))])]),
("High_Level_Syntax_Invariants.class_inv2",
[(Binding.name "int2", "int",
[Free ("High_Level_Syntax_Invariants_class_inv2_int2_Attribute_Not_Initialized", \<^typ>\<open>int\<close>),
Const ("Groups.one_class.one", \<^typ>\<open>int\<close>)])])]
val class_list' = map
(fn class =>
let val (cid, attr_list) = class
in (cid, map (update_class_term_list attr_name term') attr_list) end) class_list
\<close>
doc_class test_lattice =
test_lattice_attr :: "int set"
text*[test_lattice1::test_lattice, test_lattice_attr="{2}"]\<open>\<close>
print_doc_items
update_instance*[test_lattice1::test_lattice, test_lattice_attr+="{4}"]
print_doc_items
doc_class test_string =
test_string_attr:: "string"
text*[test_string1::test_string, test_string_attr="''abc''"]\<open>\<close>
update_instance*[test_string1::test_string, test_string_attr +="''def''"]
print_doc_items
doc_class class_inv1 =
int1 :: "int"
int1 :: "int" <= "0"
invariant inv1 :: "int1 \<sigma> \<ge> 3"
doc_class class_inv0 = class_inv1 +
int1 :: "int"
text*[testinv0::class_inv0, int1=4]\<open>lorem ipsum...\<close>
print_doc_items
doc_class class_inv2 = class_inv1 +
int1 :: "int" <= "1"
int2 :: "int"
invariant inv2 :: "int2 \<sigma> < 2"
@ -40,6 +94,9 @@ update_instance*[testinv1::class_inv1, int1:=1]
*)
text*[testinv2::class_inv2, int1=3, int2=1]\<open>lorem ipsum...\<close>
print_doc_items
value*\<open>@{class-inv2 \<open>testinv2\<close>}\<close>
value*\<open>int1 @{class-inv2 \<open>testinv2\<close>}\<close>
text\<open>
The value of each attribute defined for the instances is checked against their classes' invariants.
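To make this checking discipline concrete outside Isabelle, one can picture a per-class registry of predicates that is consulted for an instance's own class and all of its superclasses. The following Python sketch is purely illustrative (the registry, hierarchy, and names are assumptions, not Isabelle/DOF API):

```python
# invariants corresponding to inv1 (int1 >= 3) and inv2 (int2 < 2) above
invariants = {
    "class_inv1": lambda inst: inst["int1"] >= 3,
    "class_inv2": lambda inst: inst["int2"] < 2,
}
# each class maps to itself plus its superclasses
hierarchy = {"class_inv1": ["class_inv1"],
             "class_inv2": ["class_inv2", "class_inv1"]}

def check_instance(cid, inst):
    """Check `inst` against the invariants of `cid` and all superclasses."""
    for c in hierarchy[cid]:
        inv = invariants.get(c)
        if inv is not None and not inv(inst):
            raise ValueError(c + " invariant violation")
    return True

check_instance("class_inv2", {"int1": 3, "int2": 1})  # satisfies both
```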
@ -146,4 +203,799 @@ declare[[invariants_checking_with_tactics = false]]
declare[[invariants_checking = false]]
text*[testtest0::class_inv1]\<open>lorem ipsum...\<close>
text*[testtest::class_inv1, int1=2]\<open>lorem ipsum...\<close>
text*[testtest1::class_inv1, int1=4]\<open>lorem ipsum...\<close>
update_instance*[testtest::class_inv1, int1+=1]
update_instance*[testtest::class_inv1, int1:=3]
text*[testtest2::class_inv2]\<open>lorem ipsum...\<close>
update_instance*[testtest2::class_inv2, int2+=1]
text*[testtest00::class_inv0]\<open>lorem ipsum...\<close>
ML\<open>
val thy = @{theory}
val cid = "class_inv0"
val t = DOF_core.parse_cid_global thy cid
\<close>
print_doc_classes
print_doc_items
find_consts name:"class_inv1"
find_theorems name:"class_inv1*simps"
find_theorems name:"class_inv1*defs*"
find_theorems name:"class_inv1*ext*"
record point =
xcoord :: int
ycoord :: int
record pointt = point +
xcoord :: int
ycoord :: int
find_consts name:"xcoord"
term\<open>pointt.make\<close>
definition pt1 :: point where "pt1 \<equiv> \<lparr>point.xcoord = 1 , ycoord = 2 \<rparr>"
definition pt2 :: point where "pt2 \<equiv> \<lparr>point.xcoord = 2 , ycoord = 4 \<rparr>"
record cpoint = point +
color :: int
term\<open>\<lparr>xcoord=1, ycoord=3\<rparr>\<close>
term\<open>\<lparr>color = 3, xcoord=1, ycoord=3\<rparr>\<close>
definition cpt1 :: cpoint where "cpt1 \<equiv> \<lparr>point.xcoord=1, ycoord=3, color = 3\<rparr>"
doc_class class_inv4 =
pt_attr :: point <= "pt1"
text*[testtest4::class_inv4, pt_attr="pt2"]\<open>lorem ipsum...\<close>
find_consts name:"point"
find_theorems name:"point*simps"
find_theorems name:"point*defs*"
find_theorems name:"cpoint*ext*"
doc_class class_inv3 =
class_inv1_attr :: "class_inv1 set" <= "{@{class-inv1 \<open>testtest1\<close>}}"
pt :: point
text*[testtest3::class_inv3
, class_inv1_attr="{@{class-inv1 \<open>testtest\<close>}, @{class-inv1 \<open>testtest1\<close>}}"
, pt="\<lparr>point.xcoord = 1 , ycoord = 2 \<rparr>\<lparr>point.xcoord := 2\<rparr>"
]\<open>lorem ipsum...\<close>
doc_class class_inv5 =
a1 :: "class_inv1 set"
b1 :: point
text*[testtest5::class_inv5]\<open>lorem ipsum...\<close>
doc_class class_inv6 = class_inv5 +
a1 :: "class_inv1 set"
b1 :: point
text*[testtest6::class_inv6]\<open>lorem ipsum...\<close>
doc_class class_inv7 = class_inv6 +
a1 :: "class_inv1 set"
b1 :: point
text*[testtest7::class_inv7]\<open>lorem ipsum...\<close>
print_doc_classes
doc_class class_inv8 = class_inv7 +
a1 :: "class_inv1 set"
b1 :: point
c1 :: int
text*[testtest8::class_inv8]\<open>lorem ipsum...\<close>
ML\<open>
val pt1 = @{term "pt1"}
(*val thy = @{theory}*)
val {input_term,...} = the (DOF_core.get_object_global "testtest" @{theory})
(*val {input_term,...} = the (DOF_core.get_object_global "introduction2" @{theory})*)
(*val {input_term,...} = the (DOF_core.get_object_global "testtest3" @{theory})*)
(*val {input_term,...} = the (DOF_core.get_object_global "testtest4" @{theory})*)
val update $ abs $ record = input_term
val make $ tag $ attr = record
\<close>
term\<open>(1::int) + aaaa::int\<close>
term \<open>class_inv1.make\<close>
typ\<open>'a class_inv1_scheme\<close>
find_consts name:"class_inv1"
find_theorems name:"class_inv1"
find_theorems name:"class_inv3"
ML\<open>
val thy = @{theory}
val oid = "testtest1"
val {input_term,value,cid=class_name,...} = the (DOF_core.get_object_global oid thy)
val _ = writeln(Syntax.string_of_term_global thy input_term)
val _ = writeln(Syntax.string_of_term_global thy value)
val purified_class_name = String.translate (fn #"." => "_" | x => String.implode [x]) class_name
fun attr_to_free (binding, typ, _) = Free (purified_class_name ^ "_"
^ (Binding.name_of binding)
^ "_Attribute_Not_Initialized", typ)
val class_list = DOF_core.get_attributes class_name thy
fun attrs_filter [] = []
| attrs_filter (x::xs) =
let val (cid, ys) = x
fun is_duplicated _ [] = false
| is_duplicated y (x::xs) =
let val (_, ys) = x
in if exists (map_eq_fst_triple Binding.name_of y) ys
then true
else is_duplicated y xs end
in (cid, filter_out (fn y => is_duplicated y xs) ys)::attrs_filter xs end
val class_list' = rev (attrs_filter (rev class_list))
val update_name = "High_Level_Syntax_Invariants.class_inv1.int1_update"
val bname = Long_Name.base_name update_name
val attr_name = String.extract (bname, 0, SOME (String.size(bname) - String.size(Record.updateN)))
val term = Const ("Groups.zero_class.zero", @{typ "int"})
val term = Bound 0
fun normalize attr_name term attr =
let
val (attr_id, ty, term_option) = attr
in
if (Binding.name_of attr_id = attr_name)
then
(attr_id, ty, SOME term)
else attr
end
val class_list_test = map
(fn class =>
let val (cid, attr_list) = class
in (cid, map (normalize attr_name term) attr_list) end) class_list'
val qual = Long_Name.qualifier update_name
\<close>
ML\<open>
val thy = @{theory}
val oid = "testtest2"
val {input_term,cid=class_name,...} = the (DOF_core.get_object_global oid thy)
val _ = writeln(Syntax.string_of_term_global thy input_term)
val t = DOF_core.is_subclass_global thy "High_Level_Syntax_Invariants.class_inv2" "High_Level_Syntax_Invariants.class_inv1"
\<close>
ML\<open>
val thy = @{theory}
val oid = "testtest2"
val {input_term,cid=class_name,...} = the (DOF_core.get_object_global oid thy)
val _ = writeln(Syntax.string_of_term_global thy input_term)
val (Const(update_name,typ) $ Abs(s, typ', term') $ term) = input_term
val Type(s, [t, u]) = typ
val T = Syntax.read_typ_global thy class_name
val T' = T --> T
val test = u = T'
\<close>
ML\<open>
val thy = @{theory}
val oid = "testtest2"
val oid = "testinv2"
val {input_term,cid=class_name,...} = the (DOF_core.get_object_global oid thy)
val _ = writeln(Syntax.string_of_term_global thy input_term)
val class_list = DOF_core.get_attributes class_name thy
fun attrs_filter [] = []
| attrs_filter (x::xs) =
let val (cid, ys) = x
fun is_duplicated _ [] = false
| is_duplicated y (x::xs) =
let val (_, ys) = x
in if exists (map_eq_fst_triple Binding.name_of y) ys
then true
else is_duplicated y xs end
in (cid, filter_out (fn y => is_duplicated y xs) ys)::attrs_filter xs end
val class_list' = rev (attrs_filter (rev class_list))
val purified_class_name = String.translate (fn #"." => "_" | x => String.implode [x]) class_name
val class_list'' = map (fn (string, attrs) =>
(string , map (fn (binding, typ, term_option) =>
case term_option of
NONE => let
val term = Free (purified_class_name ^ "_"
^ (Binding.name_of binding)
^ "_Attribute_Not_Initialized", typ)
in (binding, typ, [term]) end
| SOME (term) => (binding, typ, [term])) attrs))
class_list'
val (Const(update_name,typ) $ Abs(s, typ', term') $ term) = input_term
val update_bname = Long_Name.base_name update_name
val attr_name = String.extract (update_bname, 0, SOME (String.size(update_bname) - String.size(Record.updateN)))
val update_qual = Long_Name.qualifier update_name
val (Const(make_name, typ''), ts)= strip_comb term
val make_bname = Long_Name.base_name make_name
val make_qual = Long_Name.qualifier make_name
fun update_class_term_list attr_name term attr =
let
val (attr_id, ty, term_list) = attr
in
if (Binding.name_of attr_id = attr_name)
then
(attr_id, ty, term_list @ [term])
else attr
end
val class_list''' = map
(fn class =>
let val (cid, attr_list) = class
in (cid, map (update_class_term_list attr_name term') attr_list) end) class_list''
val (Const(update_name,typ) $ Abs(s, typ', term') $ term) = term
val (Const(make_name, typ''), ts)= strip_comb term
val make_bname' = Long_Name.base_name make_name
val make_qual' = Long_Name.qualifier make_name
(*val (Const(make_name, typ''), ts)= strip_comb term
val make_bname = Long_Name.base_name make_name
val make_qual = Long_Name.qualifier make_name
*)
fun normalize_term [term] = term
| normalize_term (x::y::xs) =
case y of
\<^Const>\<open>Groups.plus_class.plus \<^typ>\<open>int\<close>\<close> $ Bound 0 $ t =>
(*let val (T, i) = HOLogic.dest_number x
val (T',j) = HOLogic.dest_number t*)
let val evaluated_term = Value_Command.value @{context} (\<^Const>\<open>Groups.plus_class.plus \<^typ>\<open>int\<close>\<close> $ x $ t)
(*in normalize_term ((HOLogic.mk_number T (i + j))::xs) end*)
in normalize_term (evaluated_term::xs) end
(*| \<^Const>\<open>Lattices.sup_class.sup \<^Type>\<open>set \<^typ>\<open>_\<close>\<close>\<close> $ Bound 0 $ t =>
Value_Command.value*)
| _ => normalize_term (y::xs)
val tag_attr = HOLogic.mk_number @{typ "int"}
fun add_tag_to_attrs tag_attr thy (cid, filtered_attr_list) =
if DOF_core.is_virtual cid thy
then (tag_attr (serial ()))::map (fn (binding, ty, term) => term) filtered_attr_list
else map (fn (binding, ty, term) => term) filtered_attr_list
fun update_class_term_list attr_name term attr =
let
val (attr_id, ty, term_list) = attr
in
if (Binding.name_of attr_id = attr_name)
then
(attr_id, ty, term_list @ [term])
else attr
end
(*val class_list4 = map
(fn class =>
let val (cid, attr_list) = class
in (cid, map (update_class_term_list attr_name term') attr_list) end) class_list'''
val _ = writeln ("In normalize_record update class_list': " ^ @{make_string} class_list4)*)
val class_list5 = map
(fn (string, attrs) =>
(string, map (fn (binding, ty, ts) =>
(binding, ty, normalize_term ts)) attrs)) class_list'''
val class_list6 = flat (map (add_tag_to_attrs tag_attr thy) class_list5)
\<close>
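The `normalize_term` function above folds a list of recorded attribute updates into a single value: an entry of the form `Bound 0 + t` increments the previous value, and anything else overwrites it. A hypothetical Python model of that fold (the update encoding is invented for illustration):

```python
def normalize_updates(values):
    """Fold a base value plus a list of updates into one value.
    ("add", n) models an `x + n` update of the previous value (+=);
    a bare number models an overwrite (:=), as in the ML normalize_term."""
    acc = values[0]
    for upd in values[1:]:
        if isinstance(upd, tuple) and upd[0] == "add":
            acc = acc + upd[1]
        else:
            acc = upd
    return acc

normalize_updates([1, ("add", 1)])     # 1 += 1  ->  2
normalize_updates([0, 3, ("add", 2)])  # overwrite to 3, then += 2  ->  5
```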
print_doc_items
ML\<open>
val thy = @{theory}
val oid = "testtest3"
val oid = "testinv2"
val {input_term,cid=class_name,...} = the (DOF_core.get_object_global oid thy)
val _ = writeln(Syntax.string_of_term_global thy input_term)
(*val a $ b $ c = input_term*)
(*val d $ e $ f = c*)
(*val test_striped = strip_comb f*)
(*val u $ t = f*)
val purified_class_name = String.translate (fn #"." => "_" | x => String.implode [x]) class_name
fun attr_to_free (binding, typ, _) = Free (purified_class_name ^ "_"
^ (Binding.name_of binding)
^ "_Attribute_Not_Initialized", typ)
val class_list = DOF_core.get_attributes class_name thy
fun attrs_filter [] = []
| attrs_filter (x::xs) =
let val (cid, ys) = x
fun is_duplicated _ [] = false
| is_duplicated y (x::xs) =
let val (_, ys) = x
in if exists (map_eq_fst_triple Binding.name_of y) ys
then true
else is_duplicated y xs end
in (cid, filter_out (fn y => is_duplicated y xs) ys)::attrs_filter xs end
val class_list' = rev (attrs_filter (rev class_list))
val class_list'' = map (fn (string, attrs) =>
(string , map (fn (binding, typ, term_option) =>
case term_option of
NONE => let
val term = Free (purified_class_name ^ "_"
^ (Binding.name_of binding)
^ "_Attribute_Not_Initialized", typ)
in (binding, typ, [term]) end
| SOME (term) => (binding, typ, [term])) attrs))
class_list'
(*val class_list_test' = map (fn (s, [(b, ty, toption)]) => (s, [(b, ty, [toption])])) class_list'*)
fun normalize_record (Const(update_name, Type(s, [t, u])) $ Abs(_, typ', term') $ term) class_list cid thy =
let
val _ = writeln ("AAAAAAAAAAAAA")
(* I must also check the class to rule out updates of a standard record;
   see testtest3 *)
val update_bname = Long_Name.base_name update_name
val attr_name = String.extract (update_bname, 0, SOME (String.size(update_bname) - String.size(Record.updateN)))
val update_qual = Long_Name.qualifier update_name
val class_typ = Syntax.read_typ_global thy class_name
val class_typ' = class_typ --> class_typ
(*val (Const(make_name, typ''), ts)= strip_comb term
val make_bname = Long_Name.base_name make_name
val make_qual = Long_Name.qualifier make_name*)
in
if (String.isSuffix Record.updateN update_bname andalso
(update_qual = cid orelse
(DOF_core.is_subclass_global thy cid update_qual andalso class_typ' = u))
(*andalso make_bname = makeN andalso make_qual = cid*))
then
let
val (Const(make_name, typ''), ts)= strip_comb term
val make_bname = Long_Name.base_name make_name
val make_qual = Long_Name.qualifier make_name
in
if (make_bname = makeN andalso make_qual = cid)
then
let
fun normalize_attribute [term] = term
| normalize_attribute (x::y::xs) =
case y of
\<^Const>\<open>Groups.plus_class.plus \<^typ>\<open>int\<close>\<close> $ Bound 0 $ t =>
(*let val (T, i) = HOLogic.dest_number x
val (T',j) = HOLogic.dest_number t*)
let val evaluated_term = Value_Command.value @{context} (\<^Const>\<open>Groups.plus_class.plus \<^typ>\<open>int\<close>\<close> $ x $ t)
(*in normalize_attribute ((HOLogic.mk_number T (i + j))::xs) end*)
in normalize_attribute (evaluated_term::xs) end
(*| \<^Const>\<open>Lattices.sup_class.sup \<^Type>\<open>set \<^typ>\<open>_\<close>\<close>\<close> $ Bound 0 $ t =>
Value_Command.value*)
| _ => normalize_attribute (y::xs)
val tag_attr = HOLogic.mk_number @{typ "int"}
fun add_tag_to_attrs tag_attr thy (cid, filtered_attr_list) =
if DOF_core.is_virtual cid thy
then (tag_attr (serial ()))::map (fn (binding, ty, term) => term) filtered_attr_list
else map (fn (binding, ty, term) => term) filtered_attr_list
val class_list'' = map
(fn (string, attrs) =>
(string, map (fn (binding, ty, ts) =>
(binding, ty, normalize_attribute ts)) attrs)) class_list
val class_list''' = flat (map (add_tag_to_attrs tag_attr thy) class_list'')
(*val make_const = Syntax.read_term_global thy (Long_Name.qualify cid makeN);*)
in list_comb (Const(make_name, typ''), (tag_attr (serial()))::class_list''') end
(*in error ("BBBBBBB") end*)
else
let
fun update_class_term_list attr_name term attr =
let
val (attr_id, ty, term_list) = attr
in
if (Binding.name_of attr_id = attr_name)
then
(attr_id, ty, term_list @ [term])
else attr
end
val class_list' = map
(fn class =>
let val (cid, attr_list) = class
in (cid, map (update_class_term_list attr_name term') attr_list) end) class_list
val _ = writeln ("In normalize_record update class_list': " ^ @{make_string} class_list')
in normalize_record term class_list' cid thy end
end
(*in
normalize_record term class_list' cid thy
(*(Const(update_name,typ) $ abs) :: normalize_record term cid updates thy*)
end*)
else normalize_record term class_list cid thy
end
| normalize_record (t1 $ t2) class_list cid thy =
(normalize_record t1 class_list cid thy) $ (normalize_record t2 class_list cid thy)
| normalize_record t class_list cid thy = t
val record = normalize_record input_term class_list'' class_name thy
val _ = writeln(Syntax.string_of_term_global thy record)
(*val class_int'' = normalize_record input_term class_list_test' class_name thy*)
\<close>
term\<open>class_inv1.make 5798756 4\<close>
value\<open>class_inv1.make 5798756 4\<close>
term\<open>class_inv2.make 5813388 1 (High_Level_Syntax_Invariants_class_inv2_int2_Attribute_Not_Initialized + 1)\<close>
value\<open>class_inv2.make 5813388 1 (High_Level_Syntax_Invariants_class_inv2_int2_Attribute_Not_Initialized + 1)\<close>
term*\<open>class_inv3.make
5862176
{@{class-inv1 ''testtest''}, @{class-inv1 ''testtest1''}}
(\<lparr>point.xcoord = 1, ycoord = 2\<rparr>\<lparr>point.xcoord := 2\<rparr>)\<close>
value*\<open>class_inv3.make
5862176
{@{class-inv1 ''testtest''}, @{class-inv1 ''testtest1''}}
(\<lparr>point.xcoord = 1, ycoord = 2\<rparr>\<lparr>point.xcoord := 2\<rparr>)\<close>
value\<open>List.coset [(1::int), 2]\<close>
value\<open>- set [(1::int), 2]\<close>
ML\<open>
val t = \<^const_name>\<open>Lattices.sup\<close>
(*val t = Type(\<^type_name>\<open>set\<close>, _)*)
val tt = \<^Type>\<open>set \<^typ>\<open>'a\<close>\<close>
(*val ttt = tttt as Type(@{type_name "set"},_)*)
\<close>
find_consts name:"Groups.plus_class.plus"
find_consts name:"Nil"
find_theorems name:"ISA_docitem"
find_theorems name:"ISA_thm"
value\<open>Cons (1::int) Nil\<close>
doc_class lattice_test =
attr_lattice :: "int set" <= "{1}"
text*[lattice_test_instance::lattice_test, attr_lattice ="{2}"]\<open>\<close>
update_instance*[lattice_test_instance::lattice_test, attr_lattice +="{3}"]
value\<open>sup (1::int) 2\<close>
value\<open>sup (sup (sup {1::int} {2}) {3}) {4}\<close>
value\<open>sup (sup {1::int} {2}) {3}\<close>
value\<open>sup {1::int} {2}\<close>
value\<open>inf (inf {1::int} {2}) {3}\<close>
value*\<open>attr_lattice @{lattice-test \<open>lattice_test_instance\<close>}\<close>
ML\<open>
val t = \<^term>\<open>(3::int) + 4\<close>
(*val tt = @{value "3 + 4"}*)
\<close>
ML\<open>
val t = HOLogic.mk_number \<^typ>\<open>int\<close> 12
val tt = HOLogic.dest_number t
\<close>
ML\<open>
val t = \<^Const>\<open>Groups.plus_class.plus \<^typ>\<open>int\<close>\<close>
\<close>
ML\<open>
val t = \<^Const>\<open>Isa_DOF.ISA_docitem \<^Type>\<open>nat\<close>\<close>
\<close>
ML\<open>
val t = \<^Const>\<open>Nil \<^typ>\<open>int\<close>\<close>
val tt = \<^Const>\<open>Cons \<^Type>\<open>int\<close>\<close>
\<close>
ML\<open>
fn (A, B) => \<^Const>\<open>conj for A B\<close>
\<close>
ML\<open>
val T1 = \<^typ>\<open>int \<Rightarrow> int \<Rightarrow> int\<close>
val T2 = \<^typ>\<open>int list\<close>
val t = Const (\<^const_name>\<open>Groups.plus\<close>, T1)
(*val tt = \<^const>\<open>Nil\<close> $ \<^typ>\<open>int list\<close>*)
val tt = \<^const>\<open>Isa_DOF.ISA_thm\<close> $ (HOLogic.mk_string "test")
(*val nil = \<^const>\<open>Nil\<close> $ \<^typ>\<open>int =>\<close>*)
val term = \<^term>\<open>Cons\<close>
val nil' = Const ("List.list.Nil", T2);
(*val ts = \<^const>\<open>Cons\<close> $ @{term "1::int"} $ \<^const>\<open>Nil\<close>*)
val t = @{typ "int \<Rightarrow> int"}
(*val tt = \<^const>\<open>Groups.plus_class.plus\<close> $ \<^typ>\<open>int \<Rightarrow> int \<Rightarrow> int\<close>*)
\<close>
ML\<open>
(*val thy = @{theory}*)
(*val t = [@{typ "class_inv1 set"} --> @{typ "class_inv1 set"}, @{typ "class_inv3"}] ---> @{typ "class_inv3"}*)
(*val t = @{typ "(class_inv1 set \<Rightarrow> class_inv1 set) \<Rightarrow> class_inv3 \<Rightarrow> class_inv3"}*)
(*val Type(n, [t]) = @{typ "(class_inv1 set \<Rightarrow> class_inv1 set) \<Rightarrow> class_inv3 \<Rightarrow> class_inv3"}*)
(*val Type(n1,[t1,t2]) = t*)
(*val cid = Long_Name.qualifier n1*)
(*val testesees = DOF_core.is_defined_cid_global cid thy*)
val thy = @{theory}
fun attrs_filter [] = []
| attrs_filter (x::xs) =
let val (cid, ys) = x
fun is_duplicated _ [] = false
| is_duplicated y (x::xs) =
let val (_, ys) = x
in if exists (map_eq_fst_triple Binding.name_of y) ys
then true
else is_duplicated y xs end
in (cid, filter_out (fn y => is_duplicated y xs) ys)::attrs_filter xs end
val oid = "testtest3"
val {input_term,cid,...} = the (DOF_core.get_object_global "testtest0" @{theory})
(*val (Const(t, u)) $ v = input_term*)
val Const(s, typ) $ abs $ t = input_term
val Const(name, typ) = fst (strip_comb t)
(*val test = Free ("High_Level_Syntax_Invariants_class_inv1_int1_Attribute_Not_Initialized", @{typ int})*)
(*val test' = strip_comb test*)
(*val u $ v $ w = t*)
(*val {input_term,...} = the (DOF_core.get_object_global "introduction2" @{theory})*)
(*val {input_term,cid,...} = the (DOF_core.get_object_global oid @{theory})*)
(*val {input_term,...} = the (DOF_core.get_object_global "testtest4" @{theory})*)
(*val input_term = @{term "1::int"}*)
val _ = writeln(Syntax.string_of_term_global thy input_term)
val elaborated_term = DOF_core.transduce_term_global {mk_elaboration=true} (input_term,\<^here>) thy
val _ = writeln(Syntax.string_of_term_global thy elaborated_term)
fun test_normalize_record (Const(update_name,typ) $ Abs(s, typ', term') $ term) cid updates thy =
let
val _ = writeln ("AAAAAAAAAAAAA")
(* I must also check the class to rule out updates of a standard record;
   see testtest3 *)
val bname = Long_Name.base_name update_name
val qual = Long_Name.qualifier update_name
in
if (String.isSuffix Record.updateN bname andalso qual = cid)
then
let
val _ = writeln(Syntax.string_of_term_global thy term)
val Const(make_name, typ) $ t = term
val bname = Long_Name.base_name make_name
val qual = Long_Name.qualifier make_name
in
(*test_normalize_record term cid (Const(update_name,typ) $ Abs(s, typ', term')::updates) thy*)
if (bname = makeN andalso qual = cid)
then term
else term
end
(*(Const(update_name,typ) $ abs) :: normalize_record' term cid updates thy*)
else
(*test_normalize_record term cid updates thy*)
term
end
| test_normalize_record t cid updates thy = t
(*val (Const(update_name,typ) $ Abs(s, typ', term') $ term) = input_term*)
val updated_term = test_normalize_record input_term "class_inv1" "" thy
(*val a $ b = input_term*)
(*val (Const(nameeee,typeeee) $ termeeee) = input_term*)
(*val (Const(isa_class_nameeee,typeeee) $ abseee $ termeeee) = input_term*)
(*val a' $ b' $ c' = elaborated_term*)
(*val Const(updatee,typee) $ absee $ termee = input_term*)
(*val Const(updateee,typeee) $ abseee $ termeee = termee*)
(*val bnamee = Long_Name.base_name updatee*)
(*val long_cid = Long_Name.qualifier updatee*)
(*val class_list' = DOF_core.get_attributes long_cid thy*)
val class_list' = DOF_core.get_attributes cid thy
val class_list'' = rev (attrs_filter (rev class_list'))
fun normalize_record
(Const(name,typ) $ term) cid updates thy =
if DOF_core.is_class_ISA thy name
then
let val instance_name = HOLogic.dest_string term
val input_term = #input_term (the (DOF_core.get_object_global instance_name thy))
val elaborated_term = DOF_core.transduce_term_global {mk_elaboration=true} (input_term, \<^here>) thy
val bname = Long_Name.base_name name
val qual = Long_Name.qualifier name
val class_name = DOF_core.get_class_name_without_prefix bname
val long_cid = Long_Name.qualify qual class_name
in normalize_record elaborated_term long_cid [] thy end
else Const(name,typ) $ normalize_record term cid updates thy
| normalize_record (Const(update_name,typ) $ Abs(s, typ', term') $ term) cid updates thy = let
val _ = writeln ("AAAAAAAAAAAAA")
(* I must also check the class to rule out updates of a standard record;
   see testtest3 *)
val bname = Long_Name.base_name update_name
val qual = Long_Name.qualifier update_name
in
if (String.isSuffix Record.updateN bname andalso qual = cid)
then
normalize_record term cid (Const(update_name,typ) $ Abs(s, typ', term')::updates) thy
(*(Const(update_name,typ) $ abs) :: normalize_record term cid updates thy*)
else normalize_record term cid updates thy
end
| normalize_record (Const(name,typ)) cid updates thy =
let
val bname = Long_Name.base_name name
val qual = Long_Name.qualifier name
in if (bname = makeN andalso qual = cid)
then
let
val purified_class_name = String.translate (fn #"." => "_" | x => String.implode [x]) cid
val make_const = Syntax.read_term_global thy name
fun attr_to_free (binding, typ, term_option) updates =
case term_option of
NONE => ()
| SOME (term) => ()
(*Free (purified_class_name ^ "_"
^ (Binding.name_of binding)
^ "_Attribute_Not_Initialized", typ)*)
val class_list' = DOF_core.get_attributes cid thy
fun attrs_filter [] = []
| attrs_filter (x::xs) =
let val (cid, ys) = x
fun is_duplicated _ [] = false
| is_duplicated y (x::xs) =
let val (_, ys) = x
in if exists (map_eq_fst_triple Binding.name_of y) ys
then true
else is_duplicated y xs end
in (cid, filter_out (fn y => is_duplicated y xs) ys)::attrs_filter xs end
val class_list'' = rev (attrs_filter (rev class_list'))
fun tag_attr class_name = HOLogic.mk_string (class_name ^ "_tag_attribute")
fun add_tag_to_attrs_free' tag_attr thy (cid, filtered_attr_list) =
let val _ = writeln ("In create_default_object cid: " ^ cid)
val _ = writeln ("In create_default_object filtered_attr_list: " ^ @{make_string} filtered_attr_list)
in
if DOF_core.is_virtual cid thy
then (tag_attr cid)::(map (attr_to_free) filtered_attr_list)
(*then (map (attr_to_free) filtered_attr_list)*)
else (map (attr_to_free) filtered_attr_list)
end
(*fun attr_to_free (binding, typ, _) = Free (purified_class_name ^ "_"
^ (Binding.name_of binding)
^ "_Attribute_Not_Initialized", typ)
*)
in
Const(name,typ) $ normalize_record elaborated_term cid updates thy
end
else Const(name,typ) end
| normalize_record (t $ t') cid updates thy =
normalize_record t cid updates thy $ normalize_record t' cid updates thy
| normalize_record (Abs(s, ty, t)) cid updates thy = Abs (s, ty, normalize_record t cid updates thy)
| normalize_record t cid updates thy = t
fun normalize_record'
(Const(name,typ) $ term) cid updates thy =
if DOF_core.is_class_ISA thy name
then
let val instance_name = HOLogic.dest_string term
val input_term = #input_term (the (DOF_core.get_object_global instance_name thy))
val elaborated_term = DOF_core.transduce_term_global {mk_elaboration=true} (input_term, \<^here>) thy
val bname = Long_Name.base_name name
val qual = Long_Name.qualifier name
val class_name = DOF_core.get_class_name_without_prefix bname
val long_cid = Long_Name.qualify qual class_name
in normalize_record' elaborated_term long_cid [] thy end
else Const(name,typ) $ normalize_record' term cid updates thy
| normalize_record' (Const(update_name,typ) $ Abs(s, typ', term') $ term) cid updates thy = let
val _ = writeln ("AAAAAAAAAAAAA")
(* I must also check the class to rule out updates of a standard record;
   see testtest3 *)
val bname = Long_Name.base_name update_name
val qual = Long_Name.qualifier update_name
in
if (String.isSuffix Record.updateN bname andalso qual = cid)
then
normalize_record' term cid (Const(update_name,typ) $ Abs(s, typ', term')::updates) thy
(*(Const(update_name,typ) $ abs) :: normalize_record' term cid updates thy*)
else normalize_record' term cid updates thy
end
| normalize_record' (Const(name,typ)) cid updates thy =
let
val bname = Long_Name.base_name name
val qual = Long_Name.qualifier name
in if (bname = makeN andalso qual = cid)
then
let
val purified_class_name = String.translate (fn #"." => "_" | x => String.implode [x]) cid
val make_const = Syntax.read_term_global thy name
fun attr_to_free (binding, typ, term_option) updates =
case term_option of
NONE => ()
| SOME (term) => ()
(*Free (purified_class_name ^ "_"
^ (Binding.name_of binding)
^ "_Attribute_Not_Initialized", typ)*)
val class_list' = DOF_core.get_attributes cid thy
fun attrs_filter [] = []
| attrs_filter (x::xs) =
let val (cid, ys) = x
fun is_duplicated _ [] = false
| is_duplicated y (x::xs) =
let val (_, ys) = x
in if exists (map_eq_fst_triple Binding.name_of y) ys
then true
else is_duplicated y xs end
in (cid, filter_out (fn y => is_duplicated y xs) ys)::attrs_filter xs end
val class_list'' = rev (attrs_filter (rev class_list'))
fun tag_attr class_name = HOLogic.mk_string (class_name ^ "_tag_attribute")
fun add_tag_to_attrs_free' tag_attr thy (cid, filtered_attr_list) =
let val _ = writeln ("In create_default_object cid: " ^ cid)
val _ = writeln ("In create_default_object filtered_attr_list: " ^ @{make_string} filtered_attr_list)
in
if DOF_core.is_virtual cid thy
then (tag_attr cid)::(map (attr_to_free) filtered_attr_list)
(*then (map (attr_to_free) filtered_attr_list)*)
else (map (attr_to_free) filtered_attr_list)
end
(*fun attr_to_free (binding, typ, _) = Free (purified_class_name ^ "_"
^ (Binding.name_of binding)
^ "_Attribute_Not_Initialized", typ)
*)
in
Const(name,typ) $ normalize_record' elaborated_term cid updates thy
end
else Const(name,typ) end
| normalize_record' (t $ t') cid updates thy =
normalize_record' t cid updates thy $ normalize_record' t' cid updates thy
| normalize_record' (Abs(s, ty, t)) cid updates thy = Abs (s, ty, normalize_record' t cid updates thy)
| normalize_record' t cid updates thy = t
(*val update_list = normalize_record input_term cid [] thy*)
(*val update_list' = normalize_record elaborated_term*)
(*val temp = map (Syntax.string_of_term @{context}) update_list*)
(*val make_attribute_list = ()*)
(*val Const(make,typ) $ absee $ termee = input_term*)
(*val make_attributes = ()*)
(*val Const(update,typ) $ abs $ record = input_term*)
(*val bname = Long_Name.base_name update;*)
(*if String.isSubstring Record.updateN bname then () else ();*)
(*val class_name = Long_Name.qualifier update*)
(*val {simps,...} = the (Record.get_info thy class_name)*)
(*val attr_update_*)
\<close>
ML\<open>
open SHA1;
val class_name = "High_Level_Syntax_Invariants.class_inv5"
val sha = SHA1.digest class_name
val ttt = rep sha
\<close>
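For readers outside Isabelle/ML: the snippet above simply computes the SHA-1 digest of a fully qualified class name. The same computation with Python's standard library, for illustration only:

```python
import hashlib

class_name = "High_Level_Syntax_Invariants.class_inv5"
# SHA1.digest in Isabelle/ML corresponds to a plain SHA-1 over the string
sha = hashlib.sha1(class_name.encode("utf-8")).hexdigest()
# a SHA-1 hex digest is always 40 characters long
```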
ML\<open>
val testtt = @{typ "int"}
val input_term =
Const ("High_Level_Syntax_Invariants.class_inv1.make", @{typ "int \<Rightarrow> int \<Rightarrow> class_inv1"})
$ Const ("Num.num.One", @{typ "num"})
$ Free ("High_Level_Syntax_Invariants_class_inv1_int1_Attribute_Not_Initialized", @{typ"int"})
val class_name = "High_Level_Syntax_Invariants.class_inv1"
val sha = SHA1.digest class_name
val purified_class_name = String.translate (fn #"." => "_" | x => String.implode [x]) class_name
fun attr_to_free (binding, typ, _) = Free (purified_class_name ^ "_"
^ (Binding.name_of binding)
^ "_Attribute_Not_Initialized", typ)
val class_list' = DOF_core.get_attributes class_name thy
fun attrs_filter [] = []
| attrs_filter (x::xs) =
let val (cid, ys) = x
fun is_duplicated _ [] = false
| is_duplicated y (x::xs) =
let val (_, ys) = x
in if exists (map_eq_fst_triple Binding.name_of y) ys
then true
else is_duplicated y xs end
in (cid, filter_out (fn y => is_duplicated y xs) ys)::attrs_filter xs end
val class_list'' = rev (attrs_filter (rev class_list'))
val tag_attr = HOLogic.mk_number @{typ "int"}
fun add_tag_to_attrs_free' tag_attr thy (cid, filtered_attr_list) =
if DOF_core.is_virtual cid thy
then (tag_attr (serial ()))::(map (attr_to_free) filtered_attr_list)
else (map (attr_to_free) filtered_attr_list)
val class_list'''' = flat (map (add_tag_to_attrs_free' tag_attr thy) class_list'')
\<close>
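As an aside, the deduplication idea behind \<^verbatim>\<open>attrs_filter\<close> above can be sketched in isolation. The following is a minimal stand-alone sketch over hypothetical class/attribute data (plain strings, not the actual DOF data structures): traversing the inheritance chain, each attribute name is kept only the first time it is seen.
ML\<open>
(* Minimal sketch with hypothetical data: keep, for each class in the
   chain, only the attribute names not already seen earlier. *)
fun dedup _ [] = []
  | dedup seen ((cid, attrs) :: rest) =
      let val attrs' = filter_out (member (op =) seen) attrs
      in (cid, attrs') :: dedup (attrs' @ seen) rest end

val example =
  dedup [] [("class_inv1", ["int1", "int2"]), ("class_inv2", ["int2", "int3"])]
(* example = [("class_inv1", ["int1", "int2"]), ("class_inv2", ["int3"])] *)
\<close>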
ML\<open>
val Type (s, [t]) = @{typ "class_inv7"}
val Type (s', [t']) = t
val Type (s'', [t'']) = t'
\<close>
value\<open> (1::nat) \<le> (2::nat)\<close>
typ\<open>'a High_Level_Syntax_Invariants.class_inv7.class_inv7_ext\<close>
value\<open>High_Level_Syntax_Invariants.class_inv7.make\<close>
ML\<open>
val t = I
\<close>
typedecl inda
axiomatization Zeroa_Rep :: inda and Suca_Rep :: "inda \<Rightarrow> inda"
\<comment> \<open>The axiom of infinity in 2 parts:\<close>
where Suca_Rep_inject: "Suca_Rep x = Suca_Rep y \<Longrightarrow> x = y"
and Suca_Rep_not_Zero_Rep: "Suca_Rep x \<noteq> Zeroa_Rep"
inductive Nata :: "inda \<Rightarrow> bool"
where
Zero_RepI : "Nata Zeroa_Rep"
| Suc_RepI: "Nata i \<Longrightarrow> Nata (Suca_Rep i)"
typedef nataa = "{n. Nata n}"
morphisms Rep_Nata Abs_Nata
using [[simp_trace=true]]
using Nata.Zero_RepI by auto
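The construction above mirrors HOL's bootstrap of the natural numbers in \<^file>\<open>~~/src/HOL/Nat.thy\<close>. As a hedged sketch (the names \<^verbatim>\<open>Zeroa\<close> and \<^verbatim>\<open>Suca\<close> are hypothetical, not part of the original experiment), the constructors could then be lifted through the type definition; well-definedness of the successor rests on \<^verbatim>\<open>Suc_RepI\<close>:
definition Zeroa :: nataa
  where "Zeroa = Abs_Nata Zeroa_Rep"

definition Suca :: "nataa \<Rightarrow> nataa"
  where "Suca n = Abs_Nata (Suca_Rep (Rep_Nata n))"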
ML\<open>
open Term;
\<close>
print_doc_items
find_theorems (60) name:"author"
find_consts name:"author"
end

View File

@@ -1,6 +1,6 @@
chapter \<open>Notes on Isabelle/DOF for Isabelle2021-1\<close>
chapter \<open>Notes on Isabelle/DOF for Isabelle2022\<close>
theory "Isabelle2021-1"
theory "Isabelle2022"
imports Main
begin
@@ -44,33 +44,9 @@ text \<open>
we see more and more alternatives, e.g. system options or services in
Isabelle/Scala (see below).
\<^item> \<^file>\<open>$ISABELLE_DOF_HOME/etc/options\<close>
\<^item> options declared as \<^verbatim>\<open>public\<close> appear in the Isabelle/jEdit dialog
\<^action>\<open>plugin-options\<close> (according to their \<^verbatim>\<open>section\<close>)
\<^item> all options (public and non-public) are available for command-line
usage, e.g. \<^verbatim>\<open>isabelle build -o dof_url="..."\<close>
\<^item> access to options in Isabelle/ML:
\<^item> implicit (for the running ML session)
\<^ML>\<open>Options.default_string \<^system_option>\<open>dof_url\<close>\<close>
\<^item> explicit (e.g. for each theories section in
\<^file>\<open>$ISABELLE_HOME/src/Pure/Tools/build.ML\<close>):
\<^ML>\<open>fn options => Options.string options \<^system_option>\<open>dof_url\<close>\<close>
\<^item> access in Isabelle/Scala is always explicit; the initial options
should be created only once and passed around as explicit argument:
\<^scala>\<open>{
val options = isabelle.Options.init();
options.string("dof_url");
}\<close>
Note: there are no antiquotations in Isabelle/Scala, so the literal
string \<^scala>\<open>"dof_url"\<close> is unchecked.
\<^item> \<^path>\<open>$ISABELLE_DOF_HOME/etc/options\<close> should not be used for regular
Isabelle/DOF applications: thus it works properly within Isabelle/AFP,
where the component context is missing.
\<close>
@@ -107,13 +83,8 @@ subsection \<open>Document commands\<close>
text \<open>
Isar toplevel commands now support a uniform concept for
\<^ML_type>\<open>Toplevel.presentation\<close>, but the exported interfaces are
limited to commands that do not change the semantic state: see
\<^ML_type>\<open>Toplevel.presentation\<close>, e.g. see
\<^ML>\<open>Toplevel.present\<close> and \<^ML>\<open>Toplevel.present_local_theory\<close>.
Since \<^verbatim>\<open>Toplevel.present_theory\<close> is missing in Isabelle2021-1, we use a
workaround with an alternative presentation hook: it exports
\<^verbatim>\<open>document/latex_dof\<close> files instead of regular \<^verbatim>\<open>document/latex\<close>.
\<close>
@@ -174,10 +145,7 @@ section \<open>Miscellaneous NEWS and Notes\<close>
text \<open>
\<^item> Document preparation: there are many new options etc. that might help
to fine-tune DOF output, e.g. \<^system_option>\<open>document_comment_latex\<close>.
\<^item> ML: Theory_Data / Generic_Data: "val extend = I" has been removed;
obsolete since Isabelle2021.
to fine-tune DOF output, e.g. \<^system_option>\<open>document_heading_prefix\<close>.
\<^item> ML: \<^ML>\<open>Thm.instantiate\<close> and related operations now use explicit
abstract types for the instantiate, see \<^file>\<open>~~/src/Pure/term_items.ML\<close>
@@ -196,6 +164,8 @@ text \<open>
example:
\<^ML>\<open>\<^Type>\<open>nat\<close>\<close>
\<^ML>\<open>\<^Type>\<open>prod \<^Type>\<open>int\<close> \<^Type>\<open>int\<close>\<close>\<close>
\<^ML>\<open>fn (A, B) => \<^Type>\<open>fun A B\<close>\<close>
\<^ML>\<open>\<^Type_fn>\<open>fun A B => \<open>(A, B)\<close>\<close>\<close>

View File

@@ -67,7 +67,7 @@ fun gen_enriched_document_command2 name {body} cid_transform attr_transform mark
val strg = XML.string_of (hd (Latex.output text))
val file = {path = Path.make [oid ^ "_snippet.tex"],
pos = @{here},
content = strg}
content = Bytes.string strg}
val _ = Generated_Files.write_file (Path.make ["latex_test"])
file
@@ -125,7 +125,7 @@ val _ =
\<close>
ML\<open>open Bytes\<close>
text\<open>And here a tex - text macro.\<close>
text\<open>Python's ReStructuredText (RST).
@{url \<open>https://de.wikipedia.org/wiki/ReStructuredText\<close>}. Tool: Sphinx.
@@ -352,8 +352,8 @@ Config.get ;
(*
\begin{figure}[h]
\centering
\includegraphics[scale=0.5]{graph_a}
\caption{An example graph}
@@ -362,35 +362,37 @@ Config.get ;
\begin{figure}
\centering
\begin{subfigure}[b]{0.3\textwidth}
\centering
\includegraphics[width=\textwidth]{graph1}
\caption{$y=x$}
\label{fig:y equals x}
\label{fig:y equals x} (* PROBLEM *)
\end{subfigure}
\hfill
\begin{subfigure}[b]{0.3\textwidth}
\centering
\includegraphics[width=\textwidth]{graph2}
\caption{$y=3sinx$}
\label{fig:three sin x}
\label{fig:three sin x} (* PROBLEM *)
\end{subfigure}
\hfill
\begin{subfigure}[b]{0.3\textwidth}
\centering
\includegraphics[width=\textwidth]{graph3}
\caption{$y=5/x$}
\label{fig:five over x}
\label{fig:five over x} (* PROBLEM *)
\end{subfigure}
\caption{Three simple graphs}
@@ -436,7 +438,7 @@ fun figure_antiq (check: Proof.context -> Path.T option -> Input.source -> Path.
else error "scale out of range (must be between 1 and 100"
val fig_args = Library.enclose "[" "]" (commas [wdth_ltx,scl_ltx])
val _ = writeln cap
fun proportion () = "0."^ (Int.toString (100 div length(!figure_proportions)))
fun proportion () = "0."^ (Int.toString (100 div List.length(!figure_proportions)))
\<comment> \<open>naive: assumes equal proportions\<close>
fun core arg = "\n\\centering\n"
^"\\includegraphics"
@@ -502,7 +504,8 @@ Figure*[fff::figure,src="\<open>this is a side-by-side\<close>"]
\<close>
Figure*[ffff::figure,(* caption *) src="\<open>this is another 2 side-by-side\<close>"]
Figure*[ffff::figure2, caption="\<open>this is another 2 side-by-side\<close>"]
\<open>@{figure_content [width=40, scale=35, caption="This is a left test"] "../ROOT"}\<close>
\<open>@{figure_content [width=40, scale=35, caption="This is a right test"] "../ROOT"}\<close>
@@ -510,5 +513,18 @@ Figure*[ffff::figure,(* caption *) src="\<open>this is another 2 side-by-side\<c
text\<open> @{figure "ffff(2)"}\<close>
*)
Figure*[figxxx::figure2,caption="\<open>Proofs establishing an Invariant Preservation.\<close>"]
\<open>@{boxed_theory_text [display]
\<open>lemma inv_c2_preserved : "c2_inv \<sigma> \<Longrightarrow> c1_inv (\<sigma> \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e)"
unfolding c1_inv_def c2_inv_def
Computer_Hardware_to_Hardware_morphism_def
Product_to_Component_morphism_def
by (auto simp: comp_def)
lemma Computer_Hardware_to_Hardware_total :
"Computer_Hardware_to_Hardware_morphism ` ({X. c2_inv X})
\<subseteq> ({X::Hardware. c1_inv X})"
using inv_c2_preserved by auto\<close>}\<close>
end
(*>*)

View File

@@ -7,6 +7,7 @@ session "Isabelle_DOF-tests" = "Isabelle_DOF" +
"TermAntiquotations"
"Attributes"
"Evaluation"
"Isabelle2021-1"
"High_Level_Syntax_Invariants"
"Ontology_Matching_Example"
theories [condition = ISABELLE_DOF_HOME]
"Isabelle2022"

View File

@@ -114,5 +114,15 @@ text\<open>Unfortunately due to different lexical conventions for constant symbo
We need to substitute a hyphen "-" for the underscore "_".\<close>
term*\<open>@{text-element \<open>te\<close>}\<close>
text*[testetst::E]\<open>\<close>
print_doc_items
find_consts name:"trace_update"
ML\<open>
(*val thy = @{theory}*)
val {input_term,...} = the (DOF_core.get_object_global "aaa" @{theory})
val update $ abs $ record = input_term
val make $ tag $ attr = record
\<close>
print_doc_classes
end