forked from Isabelle_DOF/Isabelle_DOF
Merge remote-tracking branch 'origin/Unreleased/Isabelle2020-RC4'
commit c320b58dd3

.config (6 lines changed)
|
@ -6,9 +6,9 @@ DOF_LATEST_DOI="10.5281/zenodo.3370483"
|
||||||
DOF_GENERIC_DOI="10.5281/zenodo.3370482"
|
DOF_GENERIC_DOI="10.5281/zenodo.3370482"
|
||||||
#
|
#
|
||||||
# Isabelle and AFP Configuration
|
# Isabelle and AFP Configuration
|
||||||
ISABELLE_VERSION="Isabelle2019: June 2019"
|
ISABELLE_VERSION="Isabelle2020: April 2020"
|
||||||
ISABELLE_URL="https://isabelle.in.tum.de/website-Isabelle2019/"
|
ISABELLE_URL="https://isabelle.in.tum.de/website-Isabelle2020/"
|
||||||
AFP_DATE="afp-2019-06-17"
|
AFP_DATE="afp-2020-04-20"
|
||||||
AFP_URL="https://www.isa-afp.org/release/"$AFP_DATE".tar.gz"
|
AFP_URL="https://www.isa-afp.org/release/"$AFP_DATE".tar.gz"
|
||||||
#
|
#
|
||||||
# Isabelle/DOF Repository Configuration
|
# Isabelle/DOF Repository Configuration
|
||||||
|
|
|
@ -20,9 +20,9 @@ foo@bar:~$ docker run -ti --rm -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-u
|
||||||
|
|
||||||
Isabelle/DOF has two major pre-requisites:
|
Isabelle/DOF has two major pre-requisites:
|
||||||
|
|
||||||
* **Isabelle:** Isabelle/DOF requires [Isabelle 2019](http://isabelle.in.tum.de/website-Isabelle2019/).
|
* **Isabelle:** Isabelle/DOF requires [Isabelle 2020](http://isabelle.in.tum.de/website-Isabelle2020/).
|
||||||
Please download the Isabelle 2019 distribution for your operating
|
Please download the Isabelle 2020 distribution for your operating
|
||||||
system from the [Isabelle website](http://isabelle.in.tum.de/website-Isabelle2019/).
|
system from the [Isabelle website](http://isabelle.in.tum.de/website-Isabelle2020/).
|
||||||
* **LaTeX:** Isabelle/DOF requires a modern pdfTeX-engine supporting the \expanded{}-primitive. This
|
* **LaTeX:** Isabelle/DOF requires a modern pdfTeX-engine supporting the \expanded{}-primitive. This
|
||||||
is, for example, included in the [TeXLive 2019](https://www.tug.org/texlive/) (or later)
|
is, for example, included in the [TeXLive 2019](https://www.tug.org/texlive/) (or later)
|
||||||
distribution. Please follow the [TeXLive installation instructions](https://www.tug.org/texlive/acquire-netinstall.html)
|
distribution. Please follow the [TeXLive installation instructions](https://www.tug.org/texlive/acquire-netinstall.html)
|
||||||
|
@ -41,7 +41,7 @@ one), the full path to the ``isabelle`` command needs to be passed as
|
||||||
using the ``--isabelle`` command line argument of the ``install`` script:
|
using the ``--isabelle`` command line argument of the ``install`` script:
|
||||||
|
|
||||||
```console
|
```console
|
||||||
foo@bar:~$ ./install --isabelle /usr/local/Isabelle2019/bin/isabelle
|
foo@bar:~$ ./install --isabelle /usr/local/Isabelle2020/bin/isabelle
|
||||||
```
|
```
|
||||||
|
|
||||||
For further command line options of the installer, please use the
|
For further command line options of the installer, please use the
|
||||||
|
|
|
@ -51,7 +51,7 @@ subsubsection*[prerequisites::technical]\<open>Pre-requisites\<close>
|
||||||
text\<open>
|
text\<open>
|
||||||
\isadof has two major pre-requisites:
|
\isadof has two major pre-requisites:
|
||||||
\<^item> \<^bold>\<open>Isabelle\<close>\bindex{Isabelle} (\isabellefullversion).
|
\<^item> \<^bold>\<open>Isabelle\<close>\bindex{Isabelle} (\isabellefullversion).
|
||||||
\isadof uses a two-part version system (e.g., 1.0.0/2019),
|
\isadof uses a two-part version system (e.g., 1.0.0/2020),
|
||||||
where the first part is the version of \isadof (using semantic versioning) and the second
|
where the first part is the version of \isadof (using semantic versioning) and the second
|
||||||
part is the supported version of Isabelle. Thus, the same version of \isadof might be
|
part is the supported version of Isabelle. Thus, the same version of \isadof might be
|
||||||
available for different versions of Isabelle.
|
available for different versions of Isabelle.
|
||||||
|
@ -266,7 +266,7 @@ users are:
|
||||||
\<^item> The file \inlinebash|ROOT|\index{ROOT}, which defines the Isabelle session. New theory files as well as new
|
\<^item> The file \inlinebash|ROOT|\index{ROOT}, which defines the Isabelle session. New theory files as well as new
|
||||||
files required by the document generation (\eg, images, bibliography database using \BibTeX, local
|
files required by the document generation (\eg, images, bibliography database using \BibTeX, local
|
||||||
\LaTeX-styles) need to be registered in this file. For details of Isabelle's build system, please
|
\LaTeX-styles) need to be registered in this file. For details of Isabelle's build system, please
|
||||||
consult the Isabelle System Manual~@{cite "wenzel:system-manual:2019"}.
|
consult the Isabelle System Manual~@{cite "wenzel:system-manual:2020"}.
|
||||||
\<^item> The file \inlinebash|praemble.tex|\index{praemble.tex}, which allows users to add additional
|
\<^item> The file \inlinebash|praemble.tex|\index{praemble.tex}, which allows users to add additional
|
||||||
\LaTeX-packages or to add/modify \LaTeX-commands.
|
\LaTeX-packages or to add/modify \LaTeX-commands.
|
||||||
\<close>
|
\<close>
|
||||||
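For concreteness, a minimal sketch of such a ROOT entry (not part of this changeset; the session name, theory name and file names are invented) registers the theories and the document files in one place:

```
session "My_Project" = "Isabelle_DOF" +
  options [document = pdf, document_output = "output"]
  theories
    "My_Theory"
  document_files
    "praemble.tex"
    "root.bib"
```

New theories go under `theories`; everything the document build needs (the `praemble.tex` above, BibTeX databases, images, local LaTeX styles) goes under `document_files`.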
|
@ -552,7 +552,7 @@ figure*[figfig3::figure,relative_width="95",src="''figures/antiquotations-PIDE''
|
||||||
\<open> Standard antiquotations referring to theory elements.\<close>
|
\<open> Standard antiquotations referring to theory elements.\<close>
|
||||||
text\<open> The corresponding view in @{docitem \<open>figfig3\<close>} shows the core part of a document
|
text\<open> The corresponding view in @{docitem \<open>figfig3\<close>} shows the core part of a document
|
||||||
conforming to the CENELEC 50128 ontology. The first sample shows standard Isabelle antiquotations
|
conforming to the CENELEC 50128 ontology. The first sample shows standard Isabelle antiquotations
|
||||||
@{cite "wenzel:isabelle-isar:2019"} referring to formal entities of a theory. This way, the informal parts
|
@{cite "wenzel:isabelle-isar:2020"} referring to formal entities of a theory. This way, the informal parts
|
||||||
of a document get ``formal content'' and become more robust under change.\<close>
|
of a document get ``formal content'' and become more robust under change.\<close>
|
||||||
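As a small sketch of the kind of antiquotation meant here (not taken from this changeset), a text element can embed references that Isabelle checks while processing the document:

```isabelle
text\<open>The law @{thm refl} and the term @{term "rev [1::int, 2] = [2, 1]"}
     are parsed and checked; a dangling reference is reported in the IDE.\<close>
```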
|
|
||||||
figure*[figfig5::figure, relative_width="95", src="''figures/srac-definition''"]
|
figure*[figfig5::figure, relative_width="95", src="''figures/srac-definition''"]
|
||||||
|
|
|
@ -220,22 +220,22 @@ text\<open>
|
||||||
As ODL is an extension of Isabelle/HOL, document class definitions can therefore be arbitrarily
|
As ODL is an extension of Isabelle/HOL, document class definitions can therefore be arbitrarily
|
||||||
mixed with standard HOL specification constructs. To make this manual self-contained, we present
|
mixed with standard HOL specification constructs. To make this manual self-contained, we present
|
||||||
syntax and semantics of the specification constructs that are most likely relevant for the
|
syntax and semantics of the specification constructs that are most likely relevant for the
|
||||||
developer of ontologies (for more details, see~@{cite "wenzel:isabelle-isar:2019"}). Our
|
developer of ontologies (for more details, see~@{cite "wenzel:isabelle-isar:2020"}). Our
|
||||||
presentation is a simplification of the original sources following the needs of ontology developers
|
presentation is a simplification of the original sources following the needs of ontology developers
|
||||||
in \isadof:
|
in \isadof:
|
||||||
\<^item> \<open>name\<close>:\index{name@\<open>name\<close>}
|
\<^item> \<open>name\<close>:\index{name@\<open>name\<close>}
|
||||||
with the syntactic category of \<open>name\<close>'s we refer to alpha-numerical identifiers
|
with the syntactic category of \<open>name\<close>'s we refer to alpha-numerical identifiers
|
||||||
(called \<open>short_id\<close>'s in @{cite "wenzel:isabelle-isar:2019"}) and identifiers
|
(called \<open>short_id\<close>'s in @{cite "wenzel:isabelle-isar:2020"}) and identifiers
|
||||||
in \inlineisar+" ... "+ which might contain certain ``quasi-letters'' such
|
in \inlineisar+" ... "+ which might contain certain ``quasi-letters'' such
|
||||||
as \inlineisar+_+, \inlineisar+-+, \inlineisar+.+ (see~@{cite "wenzel:isabelle-isar:2019"} for
|
as \inlineisar+_+, \inlineisar+-+, \inlineisar+.+ (see~@{cite "wenzel:isabelle-isar:2020"} for
|
||||||
details).
|
details).
|
||||||
\<^item> \<open>tyargs\<close>:\index{tyargs@\<open>tyargs\<close>}
|
\<^item> \<open>tyargs\<close>:\index{tyargs@\<open>tyargs\<close>}
|
||||||
\<^rail>\<open> typefree | ('(' (typefree * ',') ')')\<close>
|
\<^rail>\<open> typefree | ('(' (typefree * ',') ')')\<close>
|
||||||
\<open>typefree\<close> denotes fixed type variables (\<open>'a\<close>, \<open>'b\<close>, ...) (see~@{cite "wenzel:isabelle-isar:2019"})
|
\<open>typefree\<close> denotes fixed type variables (\<open>'a\<close>, \<open>'b\<close>, ...) (see~@{cite "wenzel:isabelle-isar:2020"})
|
||||||
\<^item> \<open>dt_name\<close>:\index{dt\_name@\<open>dt_name\<close>}
|
\<^item> \<open>dt_name\<close>:\index{dt\_name@\<open>dt_name\<close>}
|
||||||
\<^rail>\<open> (tyargs?) name (mixfix?)\<close>
|
\<^rail>\<open> (tyargs?) name (mixfix?)\<close>
|
||||||
The syntactic entity \<open>name\<close> denotes an identifier, \<open>mixfix\<close> denotes the usual
|
The syntactic entity \<open>name\<close> denotes an identifier, \<open>mixfix\<close> denotes the usual
|
||||||
parenthesized mixfix notation (see @{cite "wenzel:isabelle-isar:2019"}).
|
parenthesized mixfix notation (see @{cite "wenzel:isabelle-isar:2020"}).
|
||||||
The \<open>name\<close>'s referred to here are type names such as \<^verbatim>\<open>int\<close>, \<^verbatim>\<open>string\<close>, \<^verbatim>\<open>list\<close>, \<^verbatim>\<open>set\<close>, etc.
|
The \<open>name\<close>'s referred to here are type names such as \<^verbatim>\<open>int\<close>, \<^verbatim>\<open>string\<close>, \<^verbatim>\<open>list\<close>, \<^verbatim>\<open>set\<close>, etc.
|
||||||
\<^item> \<open>type_spec\<close>:\index{type_spec@\<open>type_spec\<close>}
|
\<^item> \<open>type_spec\<close>:\index{type_spec@\<open>type_spec\<close>}
|
||||||
\<^rail>\<open> (tyargs?) name\<close>
|
\<^rail>\<open> (tyargs?) name\<close>
|
||||||
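To see how these categories are used, here is a minimal sketch of an ODL class declaration (not part of this changeset; the class and attribute names and the default values are invented): the class and attribute identifiers are \<open>name\<close>s, while the attribute ranges and defaults are HOL types and terms.

```isabelle
doc_class working_note =
   topic    :: "string"       <= "''general''"
   priority :: "int"          <= "0"
   refs     :: "string list"  <= "[]"
```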
|
@ -256,13 +256,13 @@ text\<open>
|
||||||
mathematical notations for $\lambda$-terms in Isabelle/HOL. Example expressions are:
|
mathematical notations for $\lambda$-terms in Isabelle/HOL. Example expressions are:
|
||||||
\inlineisar|1+2| (arithmetics), \inlineisar|[1,2,3]| (lists), \inlineisar|''ab c''| (strings),
|
\inlineisar|1+2| (arithmetics), \inlineisar|[1,2,3]| (lists), \inlineisar|''ab c''| (strings),
|
||||||
\inlineisar|{1,2,3}| (sets), \inlineisar|(1,2,3)| (tuples),
|
\inlineisar|{1,2,3}| (sets), \inlineisar|(1,2,3)| (tuples),
|
||||||
\inlineisar|\<forall> x. P(x) \<and> Q x = C| (formulas). For details, see~@{cite "nipkow:whats:2019"}.
|
\inlineisar|\<forall> x. P(x) \<and> Q x = C| (formulas). For details, see~@{cite "nipkow:whats:2020"}.
|
||||||
\<close>
|
\<close>
|
||||||
|
|
||||||
text\<open>
|
text\<open>
|
||||||
Advanced ontologies can, \eg, use recursive function definitions with
|
Advanced ontologies can, \eg, use recursive function definitions with
|
||||||
pattern-matching~@{cite "kraus:defining:2019"}, extensible record
|
pattern-matching~@{cite "kraus:defining:2020"}, extensible record
|
||||||
specifications~@{cite "wenzel:isabelle-isar:2019"}, and abstract type declarations.
|
specifications~@{cite "wenzel:isabelle-isar:2020"}, and abstract type declarations.
|
||||||
\<close>
|
\<close>
|
||||||
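For instance, a recursive HOL function with pattern matching of the kind cited above could accompany an ontology; this is a minimal sketch (not part of this changeset, function name invented):

```isabelle
fun flatten :: "'a list list \<Rightarrow> 'a list"
  where
    "flatten []       = []"
  | "flatten (x # xs) = x @ flatten xs"
```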
|
|
||||||
text\<open>Note that \isadof works internally with fully qualified names in order to avoid
|
text\<open>Note that \isadof works internally with fully qualified names in order to avoid
|
||||||
|
@ -554,7 +554,7 @@ subsubsection\<open>Experts: Defining New Top-Level Commands\<close>
|
||||||
text\<open>
|
text\<open>
|
||||||
Defining such new top-level commands requires some Isabelle knowledge as well as
|
Defining such new top-level commands requires some Isabelle knowledge as well as
|
||||||
extending the dispatcher of the \LaTeX-backend. For the details of defining top-level
|
extending the dispatcher of the \LaTeX-backend. For the details of defining top-level
|
||||||
commands, we refer the reader to the Isar manual~@{cite "wenzel:isabelle-isar:2019"}.
|
commands, we refer the reader to the Isar manual~@{cite "wenzel:isabelle-isar:2020"}.
|
||||||
Here, we only give a brief example how the \inlineisar|section*|-command is defined; we
|
Here, we only give a brief example how the \inlineisar|section*|-command is defined; we
|
||||||
refer the reader to the source code of \isadof for details.
|
refer the reader to the source code of \isadof for details.
|
||||||
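Since the actual definition of section* is only referenced here, the following self-contained sketch shows the general shape of registering a top-level command from Isabelle/ML (not the Isabelle/DOF source; the command name my_section is invented, and the real section* additionally creates and checks a DOF document item):

```isabelle
theory My_Command_Sketch
  imports Main
  keywords "my_section" :: diag
begin

ML\<open>
(* illustration only: a command that merely echoes its text argument *)
val _ =
  Outer_Syntax.command \<^command_keyword>\<open>my_section\<close>
    "echo a text argument (illustration)"
    (Parse.embedded >> (fn s => Toplevel.keep (fn _ => writeln s)));
\<close>

my_section \<open>A sample heading\<close>

end
```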
|
|
||||||
|
|
|
@ -33,7 +33,7 @@ text\<open>
|
||||||
\<close>
|
\<close>
|
||||||
text\<open>
|
text\<open>
|
||||||
Semantic macros, as required by our document model, are called \<^emph>\<open>document antiquotations\<close>
|
Semantic macros, as required by our document model, are called \<^emph>\<open>document antiquotations\<close>
|
||||||
in the Isabelle literature~@{cite "wenzel:isabelle-isar:2019"}. While Isabelle's code-antiquotations
|
in the Isabelle literature~@{cite "wenzel:isabelle-isar:2020"}. While Isabelle's code-antiquotations
|
||||||
are an old concept going back to Lisp that found their way via SML and OCaml into modern
|
are an old concept going back to Lisp that found their way via SML and OCaml into modern
|
||||||
proof systems, special annotation syntax inside documentation comments has its roots in
|
proof systems, special annotation syntax inside documentation comments has its roots in
|
||||||
documentation generators such as Javadoc. Their use, however, as a mechanism to embed
|
documentation generators such as Javadoc. Their use, however, as a mechanism to embed
|
||||||
|
|
|
@ -17,10 +17,10 @@
|
||||||
year = 2019
|
year = 2019
|
||||||
}
|
}
|
||||||
|
|
||||||
@Manual{ wenzel:isabelle-isar:2019,
|
@Manual{ wenzel:isabelle-isar:2020,
|
||||||
title = {The Isabelle/Isar Reference Manual},
|
title = {The Isabelle/Isar Reference Manual},
|
||||||
author = {Makarius Wenzel},
|
author = {Makarius Wenzel},
|
||||||
year = 2019,
|
year = 2020,
|
||||||
note = {Part of the Isabelle distribution.}
|
note = {Part of the Isabelle distribution.}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -41,7 +41,7 @@
|
||||||
ontological feedback during the editing of a document.
|
ontological feedback during the editing of a document.
|
||||||
|
|
||||||
In this paper, we give an in-depth presentation of the
|
In this paper, we give an in-depth presentation of the
|
||||||
design concepts of DOF{\^a}s Ontology Definition Language
|
design concepts of DOF's Ontology Definition Language
|
||||||
(ODL) and key aspects of the technology of its
|
(ODL) and key aspects of the technology of its
|
||||||
implementation. Isabelle/DOF is the first ontology language
|
implementation. Isabelle/DOF is the first ontology language
|
||||||
supporting machine-checked links between the formal and
|
supporting machine-checked links between the formal and
|
||||||
|
@ -321,18 +321,18 @@
|
||||||
year = 2019
|
year = 2019
|
||||||
}
|
}
|
||||||
|
|
||||||
@Misc{ kraus:defining:2019,
|
@Misc{ kraus:defining:2020,
|
||||||
title = {Defining Recursive Functions in Isabelle/HOL},
|
title = {Defining Recursive Functions in Isabelle/HOL},
|
||||||
author = {Alexander Kraus},
|
author = {Alexander Kraus},
|
||||||
note = {\url{https://isabelle.in.tum.de/doc/functions.pdf}},
|
note = {\url{https://isabelle.in.tum.de/doc/functions.pdf}},
|
||||||
year = 2019
|
year = 2020
|
||||||
}
|
}
|
||||||
|
|
||||||
@Misc{ nipkow:whats:2019,
|
@Misc{ nipkow:whats:2020,
|
||||||
title = {What's in Main},
|
title = {What's in Main},
|
||||||
author = {Tobias Nipkow},
|
author = {Tobias Nipkow},
|
||||||
note = {\url{https://isabelle.in.tum.de/doc/main.pdf}},
|
note = {\url{https://isabelle.in.tum.de/doc/main.pdf}},
|
||||||
year = 2019
|
year = 2020
|
||||||
}
|
}
|
||||||
|
|
||||||
@InProceedings{ wenzel:system:2014,
|
@InProceedings{ wenzel:system:2014,
|
||||||
|
@ -455,10 +455,10 @@
|
||||||
year = 2019
|
year = 2019
|
||||||
}
|
}
|
||||||
|
|
||||||
@Booklet{ wenzel:system-manual:2019,
|
@Booklet{ wenzel:system-manual:2020,
|
||||||
author = {Makarius Wenzel},
|
author = {Makarius Wenzel},
|
||||||
title = {The {Isabelle} System Manual},
|
title = {The {Isabelle} System Manual},
|
||||||
year = 2019,
|
year = 2020,
|
||||||
note = {Part of the Isabelle distribution.}
|
note = {Part of the Isabelle distribution.}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
|
@ -14,7 +14,6 @@
|
||||||
(*<*)
|
(*<*)
|
||||||
theory TR_MyCommentedIsabelle
|
theory TR_MyCommentedIsabelle
|
||||||
imports "Isabelle_DOF.technical_report"
|
imports "Isabelle_DOF.technical_report"
|
||||||
(*imports "../../../ontologies/technical_report"*)
|
|
||||||
begin
|
begin
|
||||||
|
|
||||||
|
|
||||||
|
@ -22,7 +21,7 @@ open_monitor*[this::report]
|
||||||
(*>*)
|
(*>*)
|
||||||
|
|
||||||
title*[tit::title]\<open>My Personal, Ecclectic Isabelle Programming Manual\<close>
|
title*[tit::title]\<open>My Personal, Ecclectic Isabelle Programming Manual\<close>
|
||||||
subtitle*[stit::subtitle]\<open>Version : Isabelle 2019\<close>
|
subtitle*[stit::subtitle]\<open>Version : Isabelle 2020\<close>
|
||||||
text*[bu::author,
|
text*[bu::author,
|
||||||
email = "''wolff@lri.fr''",
|
email = "''wolff@lri.fr''",
|
||||||
affiliation = "\<open>Université Paris-Saclay, LRI, France\<close>"]\<open>Burkhart Wolff\<close>
|
affiliation = "\<open>Université Paris-Saclay, LRI, France\<close>"]\<open>Burkhart Wolff\<close>
|
||||||
|
@ -784,9 +783,9 @@ Theory.setup: (theory -> theory) -> unit; (* The thing to extend the table of "
|
||||||
Theory.get_markup: theory -> Markup.T;
|
Theory.get_markup: theory -> Markup.T;
|
||||||
Theory.axiom_table: theory -> term Name_Space.table;
|
Theory.axiom_table: theory -> term Name_Space.table;
|
||||||
Theory.axiom_space: theory -> Name_Space.T;
|
Theory.axiom_space: theory -> Name_Space.T;
|
||||||
Theory.axioms_of: theory -> (string * term) list;
|
|
||||||
Theory.all_axioms_of: theory -> (string * term) list;
|
Theory.all_axioms_of: theory -> (string * term) list;
|
||||||
Theory.defs_of: theory -> Defs.T;
|
Theory.defs_of: theory -> Defs.T;
|
||||||
|
Theory.join_theory: theory list -> theory;
|
||||||
Theory.at_begin: (theory -> theory option) -> theory -> theory;
|
Theory.at_begin: (theory -> theory option) -> theory -> theory;
|
||||||
Theory.at_end: (theory -> theory option) -> theory -> theory;
|
Theory.at_end: (theory -> theory option) -> theory -> theory;
|
||||||
Theory.begin_theory: string * Position.T -> theory list -> theory;
|
Theory.begin_theory: string * Position.T -> theory list -> theory;
|
||||||
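Note on this hunk: Theory.axiom_table and Theory.axioms_of disappear from the listed interface, while Theory.join_theory is added. A minimal sketch of querying the remaining interface (not part of this changeset):

```isabelle
ML\<open>
(* lists all axioms (name/term pairs) of the background theory *)
val axms = Theory.all_axioms_of @{theory};
\<close>
```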
|
@ -1016,7 +1015,7 @@ fun derive_thm name term lthy =
|
||||||
[] (* local assumption context *)
|
[] (* local assumption context *)
|
||||||
(term) (* parsed goal *)
|
(term) (* parsed goal *)
|
||||||
(fn _ => simp_tac lthy 1) (* proof tactic *)
|
(fn _ => simp_tac lthy 1) (* proof tactic *)
|
||||||
|> Thm.close_derivation (* some cleanups *);
|
|> Thm.close_derivation \<^here> (* some cleanups *);
|
||||||
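The only change in this hunk adds a position argument: according to this changeset, Thm.close_derivation now expects a source position, conventionally \<^here>. A minimal sketch of the updated idiom (function name invented):

```isabelle
ML\<open>
(* sketch: close a theorem's derivation, supplying the source position
   now required by the interface *)
fun tidy (thm : thm) = Thm.close_derivation \<^here> thm;
\<close>
```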
|
|
||||||
val thm111_intern = derive_thm "thm111" tt @{context} (* just for fun at the ML level *)
|
val thm111_intern = derive_thm "thm111" tt @{context} (* just for fun at the ML level *)
|
||||||
|
|
||||||
|
@ -1035,7 +1034,7 @@ text\<open>Converting a local theory transformation into a global one:\<close>
|
||||||
setup\<open>SimpleSampleProof.prove_n_store\<close>
|
setup\<open>SimpleSampleProof.prove_n_store\<close>
|
||||||
|
|
||||||
text\<open>... and there it is in the global (Isar) context:\<close>
|
text\<open>... and there it is in the global (Isar) context:\<close>
|
||||||
thm thm111
|
thm "thm111"
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
@ -2038,7 +2037,7 @@ text\<open>The heart of the LaTeX generator is to be found in the structure @{ML
|
||||||
\<close>
|
\<close>
|
||||||
|
|
||||||
ML\<open>
|
ML\<open>
|
||||||
local
|
local
|
||||||
|
|
||||||
open Latex
|
open Latex
|
||||||
|
|
||||||
|
|
install (4 lines changed)
|
@ -1,7 +1,7 @@
|
||||||
#!/usr/bin/env bash
|
#!/usr/bin/env bash
|
||||||
# Copyright (c) 2018-2019 The University of Sheffield.
|
# Copyright (c) 2018-2019 The University of Sheffield.
|
||||||
# 2019-2019 The University of Exeter.
|
# 2019-2020 The University of Exeter.
|
||||||
# 2018-2019 The University of Paris-Saclay.
|
# 2018-2020 The University of Paris-Saclay.
|
||||||
#
|
#
|
||||||
# Redistribution and use in source and binary forms, with or without
|
# Redistribution and use in source and binary forms, with or without
|
||||||
# modification, are permitted provided that the following conditions
|
# modification, are permitted provided that the following conditions
|
||||||
|
|
|
@ -48,7 +48,7 @@ fun assert_cmd some_name modes raw_t ctxt (* state*) =
|
||||||
val ty' = Term.type_of t';
|
val ty' = Term.type_of t';
|
||||||
val ty' = case ty' of @{typ "bool"} => ty' | _ => error "Assertion expressions must be boolean.";
|
val ty' = case ty' of @{typ "bool"} => ty' | _ => error "Assertion expressions must be boolean.";
|
||||||
val t' = case t' of @{term "True"} => t' | _ => error "Assertion failed.";
|
val t' = case t' of @{term "True"} => t' | _ => error "Assertion failed.";
|
||||||
val ctxt' = Variable.auto_fixes t' ctxt;
|
val ctxt' = Proof_Context.augment t' ctxt;
|
||||||
val p = Print_Mode.with_modes modes (fn () =>
|
val p = Print_Mode.with_modes modes (fn () =>
|
||||||
Pretty.block [Pretty.quote (Syntax.pretty_term ctxt' t'), Pretty.fbrk,
|
Pretty.block [Pretty.quote (Syntax.pretty_term ctxt' t'), Pretty.fbrk,
|
||||||
Pretty.str "::", Pretty.brk 1, Pretty.quote (Syntax.pretty_typ ctxt' ty')]) ();
|
Pretty.str "::", Pretty.brk 1, Pretty.quote (Syntax.pretty_typ ctxt' ty')]) ();
|
||||||
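The single change in this hunk swaps Variable.auto_fixes for Proof_Context.augment, which here plays the same role: making the variables of the term known to the context before pretty-printing. A minimal sketch of that idiom (function name invented):

```isabelle
ML\<open>
(* sketch: declare a term's variables in the context, then print the term *)
fun print_augmented ctxt t =
  let val ctxt' = Proof_Context.augment t ctxt
  in writeln (Syntax.string_of_term ctxt' t) end;
\<close>
```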
|
|
src/ROOT (17 lines changed)
|
@ -2,15 +2,32 @@ session "Isabelle_DOF" = "Functional-Automata" +
|
||||||
options [document = pdf, document_output = "output"]
|
options [document = pdf, document_output = "output"]
|
||||||
sessions
|
sessions
|
||||||
"Regular-Sets"
|
"Regular-Sets"
|
||||||
|
directories
|
||||||
|
"DOF"
|
||||||
|
"ontologies"
|
||||||
|
"ontologies/CENELEC_50128"
|
||||||
|
"ontologies/Conceptual"
|
||||||
|
"ontologies/math_exam"
|
||||||
|
"ontologies/math_paper"
|
||||||
|
"ontologies/scholarly_paper"
|
||||||
|
"ontologies/small_math"
|
||||||
|
"ontologies/technical_report"
|
||||||
theories
|
theories
|
||||||
"DOF/Isa_DOF"
|
"DOF/Isa_DOF"
|
||||||
"ontologies/ontologies"
|
"ontologies/ontologies"
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
(*
|
||||||
session "Isabelle_DOF-tests" = "Isabelle_DOF" +
|
session "Isabelle_DOF-tests" = "Isabelle_DOF" +
|
||||||
options [document = false]
|
options [document = false]
|
||||||
|
directories
|
||||||
|
tests
|
||||||
theories
|
theories
|
||||||
"tests/AssnsLemmaThmEtc"
|
"tests/AssnsLemmaThmEtc"
|
||||||
"tests/Concept_ExampleInvariant"
|
"tests/Concept_ExampleInvariant"
|
||||||
"tests/Concept_Example"
|
"tests/Concept_Example"
|
||||||
"tests/InnerSyntaxAntiquotations"
|
"tests/InnerSyntaxAntiquotations"
|
||||||
"tests/Attributes"
|
"tests/Attributes"
|
||||||
|
*)
|
||||||
|
|
|
@ -14,7 +14,7 @@
|
||||||
|
|
||||||
\NeedsTeXFormat{LaTeX2e}\relax
|
\NeedsTeXFormat{LaTeX2e}\relax
|
||||||
\ProvidesPackage{DOF-cenelec_50128}
|
\ProvidesPackage{DOF-cenelec_50128}
|
||||||
[2019/08/18 Unreleased/Isabelle2019%
|
[2019/08/18 Unreleased/Isabelle2020%
|
||||||
Document-Type Support Framework for Isabelle (CENELEC 50128).]
|
Document-Type Support Framework for Isabelle (CENELEC 50128).]
|
||||||
|
|
||||||
\RequirePackage{DOF-COL}
|
\RequirePackage{DOF-COL}
|
||||||
|
|
|
@ -14,7 +14,7 @@
|
||||||
|
|
||||||
\NeedsTeXFormat{LaTeX2e}\relax
|
\NeedsTeXFormat{LaTeX2e}\relax
|
||||||
\ProvidesPackage{DOF-scholarly_paper}
|
\ProvidesPackage{DOF-scholarly_paper}
|
||||||
[2020/01/14 Unreleased/Isabelle2019%
|
[2020/01/14 Unreleased/Isabelle2020%
|
||||||
Document-Type Support Framework for Isabelle (LNCS).]
|
Document-Type Support Framework for Isabelle (LNCS).]
|
||||||
|
|
||||||
\RequirePackage{DOF-COL}
|
\RequirePackage{DOF-COL}
|
||||||
|
|
|
@ -1,495 +0,0 @@
|
||||||
(* Title: Pure/Thy/thy_info.ML
|
|
||||||
Author: Markus Wenzel, TU Muenchen
|
|
||||||
|
|
||||||
Global theory info database, with auto-loading according to theory and
|
|
||||||
file dependencies.
|
|
||||||
*)
|
|
||||||
|
|
||||||
signature THY_INFO =
|
|
||||||
sig
|
|
||||||
type presentation_context =
|
|
||||||
{options: Options.T, file_pos: Position.T, adjust_pos: Position.T -> Position.T,
|
|
||||||
segments: Thy_Output.segment list}
|
|
||||||
val apply_presentation: presentation_context -> theory -> unit
|
|
||||||
val add_presentation: (presentation_context -> theory -> unit) -> theory -> theory
|
|
||||||
val get_names: unit -> string list
|
|
||||||
val lookup_theory: string -> theory option
|
|
||||||
val get_theory: string -> theory
|
|
||||||
val master_directory: string -> Path.T
|
|
||||||
val remove_thy: string -> unit
|
|
||||||
type context =
|
|
||||||
{options: Options.T,
|
|
||||||
symbols: HTML.symbols,
|
|
||||||
bibtex_entries: string list,
|
|
||||||
last_timing: Toplevel.transition -> Time.time}
|
|
||||||
val use_theories: context -> string -> Path.T -> (string * Position.T) list -> unit
|
|
||||||
val use_thy: string -> unit
|
|
||||||
val script_thy: Position.T -> string -> theory -> theory
|
|
||||||
val register_thy: theory -> unit
|
|
||||||
val finish: unit -> unit
|
|
||||||
end;
|
|
||||||
|
|
||||||
structure Thy_Info: THY_INFO =
|
|
||||||
struct
|
|
||||||
|
|
||||||
(** presentation of consolidated theory **)
|
|
||||||
|
|
||||||
type presentation_context =
|
|
||||||
{options: Options.T, file_pos: Position.T, adjust_pos: Position.T -> Position.T,
|
|
||||||
segments: Thy_Output.segment list};
|
|
||||||
|
|
||||||
structure Presentation = Theory_Data
|
|
||||||
(
|
|
||||||
type T = ((presentation_context -> theory -> unit) * stamp) list;
|
|
||||||
val empty = [];
|
|
||||||
val extend = I;
|
|
||||||
fun merge data : T = Library.merge (eq_snd op =) data;
|
|
||||||
);
|
|
||||||
|
|
||||||
fun apply_presentation (context: presentation_context) thy =
|
|
||||||
ignore (Presentation.get thy |> Par_List.map (fn (f, _) => f context thy));
|
|
||||||
|
|
||||||
fun add_presentation f = Presentation.map (cons (f, stamp ()));
|
|
||||||
|
|
||||||
val _ =
|
|
||||||
Theory.setup (add_presentation (fn {options, file_pos, segments, ...} => fn thy =>
|
|
||||||
if exists (Toplevel.is_skipped_proof o #state) segments then ()
|
|
||||||
else
|
|
||||||
let
|
|
||||||
val body = Thy_Output.present_thy options thy segments;
|
|
||||||
val option = Present.document_option options;
|
|
||||||
in
|
|
||||||
if #disabled option then ()
|
|
||||||
else
|
|
||||||
let
|
|
||||||
val latex = Latex.isabelle_body (Context.theory_name thy) body;
|
|
||||||
val output = [Latex.output_text latex, Latex.output_positions file_pos latex];
|
|
||||||
val _ =
|
|
||||||
if Options.bool options "export_document"
|
|
||||||
then Export.export thy "document.tex" output else ();
|
|
||||||
val _ = if #enabled option then Present.theory_output thy output else ();
|
|
||||||
in () end
|
|
||||||
end));
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
(** thy database **)
|
|
||||||
|
|
||||||
(* messages *)
|
|
||||||
|
|
||||||
val show_path = space_implode " via " o map quote;
|
|
||||||
|
|
||||||
fun cycle_msg names = "Cyclic dependency of " ^ show_path names;
|
|
||||||
|
|
||||||
|
|
||||||
(* derived graph operations *)
|
|
||||||
|
|
||||||
fun add_deps name parents G = String_Graph.add_deps_acyclic (name, parents) G
|
|
||||||
handle String_Graph.CYCLES namess => error (cat_lines (map cycle_msg namess));
|
|
||||||
|
|
||||||
fun new_entry name parents entry =
|
|
||||||
String_Graph.new_node (name, entry) #> add_deps name parents;
|
|
||||||
|
|
||||||
|
|
||||||
(* global thys *)
|
|
||||||
|
|
||||||
type deps =
|
|
||||||
{master: (Path.T * SHA1.digest), (*master dependencies for thy file*)
|
|
||||||
imports: (string * Position.T) list}; (*source specification of imports (partially qualified)*)
|
|
||||||
|
|
||||||
fun make_deps master imports : deps = {master = master, imports = imports};
|
|
||||||
|
|
||||||
fun master_dir_deps (d: deps option) =
|
|
||||||
the_default Path.current (Option.map (Path.dir o #1 o #master) d);
|
|
||||||
|
|
||||||
local
|
|
||||||
val global_thys =
|
|
||||||
Synchronized.var "Thy_Info.thys"
|
|
||||||
(String_Graph.empty: (deps option * theory option) String_Graph.T);
|
|
||||||
in
|
|
||||||
fun get_thys () = Synchronized.value global_thys;
|
|
||||||
fun change_thys f = Synchronized.change global_thys f;
|
|
||||||
end;
|
|
||||||
|
|
||||||
fun get_names () = String_Graph.topological_order (get_thys ());
|
|
||||||
|
|
||||||
|
|
||||||
(* access thy *)
|
|
||||||
|
|
||||||
fun lookup thys name = try (String_Graph.get_node thys) name;
|
|
||||||
fun lookup_thy name = lookup (get_thys ()) name;
|
|
||||||
|
|
||||||
fun get thys name =
|
|
||||||
(case lookup thys name of
|
|
||||||
SOME thy => thy
|
|
||||||
| NONE => error ("Theory loader: nothing known about theory " ^ quote name));
|
|
||||||
|
|
||||||
fun get_thy name = get (get_thys ()) name;
|
|
||||||
|
|
||||||
|
|
||||||
(* access deps *)
|
|
||||||
|
|
||||||
val lookup_deps = Option.map #1 o lookup_thy;
|
|
||||||
|
|
||||||
val master_directory = master_dir_deps o #1 o get_thy;
|
|
||||||
|
|
||||||
|
|
||||||
(* access theory *)
|
|
||||||
|
|
||||||
fun lookup_theory name =
|
|
||||||
(case lookup_thy name of
|
|
||||||
SOME (_, SOME theory) => SOME theory
|
|
||||||
| _ => NONE);
|
|
||||||
|
|
||||||
fun get_theory name =
|
|
||||||
(case lookup_theory name of
|
|
||||||
SOME theory => theory
|
|
||||||
| _ => error ("Theory loader: undefined entry for theory " ^ quote name));
|
|
||||||
|
|
||||||
val get_imports = Resources.imports_of o get_theory;
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
(** thy operations **)
|
|
||||||
|
|
||||||
(* remove *)
|
|
||||||
|
|
||||||
fun remove name thys =
|
|
||||||
(case lookup thys name of
|
|
||||||
NONE => thys
|
|
||||||
| SOME (NONE, _) => error ("Cannot update finished theory " ^ quote name)
|
|
||||||
| SOME _ =>
|
|
||||||
let
|
|
||||||
val succs = String_Graph.all_succs thys [name];
|
|
||||||
val _ = writeln ("Theory loader: removing " ^ commas_quote succs);
|
|
||||||
in fold String_Graph.del_node succs thys end);
|
|
||||||
|
|
||||||
val remove_thy = change_thys o remove;
|
|
||||||
|
|
||||||
|
|
||||||
(* update *)
|
|
||||||
|
|
||||||
fun update deps theory thys =
|
|
||||||
let
|
|
||||||
val name = Context.theory_long_name theory;
|
|
||||||
val parents = map Context.theory_long_name (Theory.parents_of theory);
|
|
||||||
|
|
||||||
val thys' = remove name thys;
|
|
||||||
val _ = map (get thys') parents;
|
|
||||||
in new_entry name parents (SOME deps, SOME theory) thys' end;
|
|
||||||
|
|
||||||
fun update_thy deps theory = change_thys (update deps theory);
|
|
||||||
|
|
||||||
|
|
||||||
(* context *)
|
|
||||||
|
|
||||||
type context =
|
|
||||||
{options: Options.T,
|
|
||||||
symbols: HTML.symbols,
|
|
||||||
bibtex_entries: string list,
|
|
||||||
last_timing: Toplevel.transition -> Time.time};
|
|
||||||
|
|
||||||
fun default_context (): context =
|
|
||||||
{options = Options.default (),
|
|
||||||
symbols = HTML.no_symbols,
|
|
||||||
bibtex_entries = [],
|
|
||||||
last_timing = K Time.zeroTime};
|
|
||||||
|
|
||||||
|
|
||||||
(* scheduling loader tasks *)
|
|
||||||
|
|
||||||
datatype result =
|
|
||||||
Result of {theory: theory, exec_id: Document_ID.exec,
|
|
||||||
present: unit -> unit, commit: unit -> unit, weight: int};
|
|
||||||
|
|
||||||
fun theory_result theory =
|
|
||||||
Result {theory = theory, exec_id = Document_ID.none, present = I, commit = I, weight = 0};
|
|
||||||
|
|
||||||
fun result_theory (Result {theory, ...}) = theory;
|
|
||||||
fun result_present (Result {present, ...}) = present;
|
|
||||||
fun result_commit (Result {commit, ...}) = commit;
|
|
||||||
fun result_ord (Result {weight = i, ...}, Result {weight = j, ...}) = int_ord (j, i);
|
|
||||||
|
|
||||||
fun join_theory (Result {theory, exec_id, ...}) =
|
|
||||||
let
|
|
||||||
val _ = Execution.join [exec_id];
|
|
||||||
val res = Exn.capture Thm.consolidate_theory theory;
|
|
||||||
val exns = maps Task_Queue.group_status (Execution.peek exec_id);
|
|
||||||
in res :: map Exn.Exn exns end;
|
|
||||||
|
|
||||||
datatype task =
|
|
||||||
Task of string list * (theory list -> result) |
|
|
||||||
Finished of theory;
|
|
||||||
|
|
||||||
fun task_finished (Task _) = false
|
|
||||||
| task_finished (Finished _) = true;
|
|
||||||
|
|
||||||
fun task_parents deps (parents: string list) = map (the o AList.lookup (op =) deps) parents;
|
|
||||||
|
|
||||||
val schedule_seq =
|
|
||||||
String_Graph.schedule (fn deps => fn (_, task) =>
|
|
||||||
(case task of
|
|
||||||
Task (parents, body) =>
|
|
||||||
let
|
|
||||||
val result = body (task_parents deps parents);
|
|
||||||
val _ = Par_Exn.release_all (join_theory result);
|
|
||||||
val _ = result_present result ();
|
|
||||||
val _ = result_commit result ();
|
|
||||||
in result_theory result end
|
|
||||||
| Finished thy => thy)) #> ignore;
|
|
||||||
|
|
||||||
val schedule_futures = Thread_Attributes.uninterruptible (fn _ => fn tasks =>
|
|
||||||
let
|
|
||||||
val futures = tasks
|
|
||||||
|> String_Graph.schedule (fn deps => fn (name, task) =>
|
|
||||||
(case task of
|
|
||||||
Task (parents, body) =>
|
|
||||||
(singleton o Future.forks)
|
|
||||||
{name = "theory:" ^ name, group = NONE,
|
|
||||||
deps = map (Future.task_of o #2) deps, pri = 0, interrupts = true}
|
|
||||||
(fn () =>
|
|
||||||
(case filter (not o can Future.join o #2) deps of
|
|
||||||
[] => body (map (result_theory o Future.join) (task_parents deps parents))
|
|
||||||
| bad =>
|
|
||||||
error
|
|
||||||
("Failed to load theory " ^ quote name ^
|
|
||||||
" (unresolved " ^ commas_quote (map #1 bad) ^ ")")))
|
|
||||||
| Finished theory => Future.value (theory_result theory)));
|
|
||||||
|
|
||||||
val results1 = futures
|
|
||||||
|> maps (fn future =>
|
|
||||||
(case Future.join_result future of
|
|
||||||
Exn.Res result => join_theory result
|
|
||||||
| Exn.Exn exn => [Exn.Exn exn]));
|
|
||||||
|
|
||||||
val results2 = futures
|
|
||||||
|> map_filter (Exn.get_res o Future.join_result)
|
|
||||||
|> sort result_ord
|
|
||||||
|> Par_List.map (fn result => Exn.capture (result_present result) ());
|
|
||||||
|
|
||||||
(* FIXME more precise commit order (!?) *)
|
|
||||||
val results3 = futures
|
|
||||||
|> map (fn future => Exn.capture (fn () => result_commit (Future.join future) ()) ());
|
|
||||||
|
|
||||||
(* FIXME avoid global Execution.reset (!??) *)
|
|
||||||
val results4 = map Exn.Exn (maps Task_Queue.group_status (Execution.reset ()));
|
|
||||||
|
|
||||||
val _ = Par_Exn.release_all (results1 @ results2 @ results3 @ results4);
|
|
||||||
in () end);
|
|
||||||
|
|
||||||
|
|
||||||
(* eval theory *)
|
|
||||||
|
|
||||||
fun excursion keywords master_dir last_timing init elements =
|
|
||||||
let
|
|
||||||
fun prepare_span st span =
|
|
||||||
Command_Span.content span
|
|
||||||
|> Command.read keywords (Command.read_thy st) master_dir init ([], ~1)
|
|
||||||
|> (fn tr => Toplevel.put_timing (last_timing tr) tr);
|
|
||||||
|
|
||||||
fun element_result span_elem (st, _) =
|
|
||||||
let
|
|
||||||
val elem = Thy_Syntax.map_element (prepare_span st) span_elem;
|
|
||||||
val (results, st') = Toplevel.element_result keywords elem st;
|
|
||||||
val pos' = Toplevel.pos_of (Thy_Syntax.last_element elem);
|
|
||||||
in (results, (st', pos')) end;
|
|
||||||
|
|
||||||
val (results, (end_state, end_pos)) =
|
|
||||||
fold_map element_result elements (Toplevel.toplevel, Position.none);
|
|
||||||
|
|
||||||
val thy = Toplevel.end_theory end_pos end_state;
|
|
||||||
in (results, thy) end;
|
|
||||||
|
|
||||||
fun eval_thy (context: context) update_time master_dir header text_pos text parents =
|
|
||||||
let
|
|
||||||
val {options, symbols, bibtex_entries, last_timing} = context;
|
|
||||||
val (name, _) = #name header;
|
|
||||||
val keywords =
|
|
||||||
fold (curry Keyword.merge_keywords o Thy_Header.get_keywords) parents
|
|
||||||
(Keyword.add_keywords (#keywords header) Keyword.empty_keywords);
|
|
||||||
|
|
||||||
val _ = writeln "eval_thy 1";
|
|
||||||
|
|
||||||
val spans = Outer_Syntax.parse_spans (Token.explode keywords text_pos text);
|
|
||||||
val _ = writeln "eval_thy 2";
|
|
||||||
val elements = Thy_Syntax.parse_elements keywords spans;
|
|
||||||
val _ = writeln "eval_thy 3";
|
|
||||||
|
|
||||||
fun init () =
|
|
||||||
Resources.begin_theory master_dir header parents
|
|
||||||
|> Present.begin_theory bibtex_entries update_time
|
|
||||||
(fn () => implode (map (HTML.present_span symbols keywords) spans));
|
|
||||||
|
|
||||||
val (results, thy) =
|
|
||||||
cond_timeit true ("theory " ^ quote name)
|
|
||||||
(fn () => excursion keywords master_dir last_timing init elements);
|
|
||||||
|
|
||||||
fun present () =
|
|
||||||
let
|
|
||||||
val _ = writeln "eval_thy 4 - present"
|
|
||||||
val segments = (spans ~~ maps Toplevel.join_results results)
|
|
||||||
|> map (fn (span, (tr, st')) => {span = span, command = tr, state = st'});
|
|
||||||
|
|
||||||
val X = if null segments then (writeln "eval_thy 5 - no segments";[])
|
|
||||||
else List.concat(map (fn X => Command_Span.content (#span X)) segments)
|
|
||||||
val Y = (String.concatWith "::") (map Token.content_of X)
|
|
||||||
val _ = writeln("eval_thy 5 BEGIN\n"^Y^"eval_thy 6 END:"^Context.theory_name thy^"\n")
|
|
||||||
|
|
||||||
val context: presentation_context =
|
|
||||||
{options = options, file_pos = text_pos, adjust_pos = I, segments = segments};
|
|
||||||
in apply_presentation context thy end;
|
|
||||||
in (thy, present, size text) end;
|
|
||||||
|
|
||||||
|
|
||||||
(* require_thy -- checking database entries wrt. the file-system *)
|
|
||||||
|
|
||||||
local
|
|
||||||
|
|
||||||
fun required_by _ [] = ""
|
|
||||||
| required_by s initiators = s ^ "(required by " ^ show_path (rev initiators) ^ ")";
|
|
||||||
|
|
||||||
fun load_thy context initiators update_time deps text (name, pos) keywords parents =
|
|
||||||
let
|
|
||||||
val _ = remove_thy name;
|
|
||||||
val _ = writeln ("Loading theory " ^ quote name ^ required_by " " initiators);
|
|
||||||
val _ = Output.try_protocol_message (Markup.loading_theory name) [];
|
|
||||||
|
|
||||||
val {master = (thy_path, _), imports} = deps;
|
|
||||||
val dir = Path.dir thy_path;
|
|
||||||
val header = Thy_Header.make (name, pos) imports keywords;
|
|
||||||
|
|
||||||
val _ = Position.reports (map #2 imports ~~ map Theory.get_markup parents);
|
|
||||||
|
|
||||||
val exec_id = Document_ID.make ();
|
|
||||||
val _ =
|
|
||||||
Execution.running Document_ID.none exec_id [] orelse
|
|
||||||
raise Fail ("Failed to register execution: " ^ Document_ID.print exec_id);
|
|
||||||
|
|
||||||
val timing_start = Timing.start ();
|
|
||||||
|
|
||||||
val text_pos = Position.put_id (Document_ID.print exec_id) (Path.position thy_path);
|
|
||||||
val (theory, present, weight) =
|
|
||||||
eval_thy context update_time dir header text_pos text
|
|
||||||
(if name = Context.PureN then [Context.the_global_context ()] else parents);
|
|
||||||
|
|
||||||
val timing_result = Timing.result timing_start;
|
|
||||||
val timing_props = [Markup.theory_timing, (Markup.nameN, name)];
|
|
||||||
val _ = Output.try_protocol_message (timing_props @ Markup.timing_properties timing_result) []
|
|
||||||
|
|
||||||
fun commit () = update_thy deps theory;
|
|
||||||
in
|
|
||||||
Result {theory = theory, exec_id = exec_id, present = present, commit = commit, weight = weight}
|
|
||||||
end;
|
|
||||||
|
|
||||||
fun check_deps dir name =
|
|
||||||
(case lookup_deps name of
|
|
||||||
SOME NONE => (true, NONE, Position.none, get_imports name, [])
|
|
||||||
| NONE =>
|
|
||||||
let val {master, text, theory_pos, imports, keywords} = Resources.check_thy dir name
|
|
||||||
in (false, SOME (make_deps master imports, text), theory_pos, imports, keywords) end
|
|
||||||
| SOME (SOME {master, ...}) =>
|
|
||||||
let
|
|
||||||
val {master = master', text = text', theory_pos = theory_pos', imports = imports',
|
|
||||||
keywords = keywords'} = Resources.check_thy dir name;
|
|
||||||
val deps' = SOME (make_deps master' imports', text');
|
|
||||||
val current =
|
|
||||||
#2 master = #2 master' andalso
|
|
||||||
(case lookup_theory name of
|
|
||||||
NONE => false
|
|
||||||
| SOME theory => Resources.loaded_files_current theory);
|
|
||||||
in (current, deps', theory_pos', imports', keywords') end);
|
|
||||||
|
|
||||||
in
|
|
||||||
|
|
||||||
fun require_thys context initiators qualifier dir strs tasks =
|
|
||||||
fold_map (require_thy context initiators qualifier dir) strs tasks |>> forall I
|
|
||||||
and require_thy context initiators qualifier dir (s, require_pos) tasks =
|
|
||||||
let
|
|
||||||
val {master_dir, theory_name, ...} = Resources.import_name qualifier dir s;
|
|
||||||
in
|
|
||||||
(case try (String_Graph.get_node tasks) theory_name of
|
|
||||||
SOME task => (task_finished task, tasks)
|
|
||||||
| NONE =>
|
|
||||||
let
|
|
||||||
val _ = member (op =) initiators theory_name andalso error (cycle_msg initiators);
|
|
||||||
|
|
||||||
val (current, deps, theory_pos, imports, keywords) = check_deps master_dir theory_name
|
|
||||||
handle ERROR msg =>
|
|
||||||
cat_error msg
|
|
||||||
("The error(s) above occurred for theory " ^ quote theory_name ^
|
|
||||||
Position.here require_pos ^ required_by "\n" initiators);
|
|
||||||
|
|
||||||
val qualifier' = Resources.theory_qualifier theory_name;
|
|
||||||
val dir' = Path.append dir (master_dir_deps (Option.map #1 deps));
|
|
||||||
|
|
||||||
val parents = map (#theory_name o Resources.import_name qualifier' dir' o #1) imports;
|
|
||||||
val (parents_current, tasks') =
|
|
||||||
require_thys context (theory_name :: initiators) qualifier' dir' imports tasks;
|
|
||||||
|
|
||||||
val all_current = current andalso parents_current;
|
|
||||||
val task =
|
|
||||||
if all_current then Finished (get_theory theory_name)
|
|
||||||
else
|
|
||||||
(case deps of
|
|
||||||
NONE => raise Fail "Malformed deps"
|
|
||||||
| SOME (dep, text) =>
|
|
||||||
let
|
|
||||||
val update_time = serial ();
|
|
||||||
val load =
|
|
||||||
load_thy context initiators update_time
|
|
||||||
dep text (theory_name, theory_pos) keywords;
|
|
||||||
in Task (parents, load) end);
|
|
||||||
|
|
||||||
val tasks'' = new_entry theory_name parents task tasks';
|
|
||||||
in (all_current, tasks'') end)
|
|
||||||
end;
|
|
||||||
|
|
||||||
end;
|
|
||||||
|
|
||||||
|
|
||||||
(* use theories *)
|
|
||||||
|
|
||||||
fun use_theories context qualifier master_dir imports =
|
|
||||||
let val (_, tasks) = require_thys context [] qualifier master_dir imports String_Graph.empty
|
|
||||||
in if Multithreading.max_threads () > 1 then schedule_futures tasks else schedule_seq tasks end;
|
|
||||||
|
|
||||||
fun use_thy name =
|
|
||||||
use_theories (default_context ()) Resources.default_qualifier
|
|
||||||
Path.current [(name, Position.none)];
|
|
||||||
|
|
||||||
|
|
||||||
(* toplevel scripting -- without maintaining database *)
|
|
||||||
|
|
||||||
fun script_thy pos txt thy =
|
|
||||||
let
|
|
||||||
val trs =
|
|
||||||
Outer_Syntax.parse thy pos txt
|
|
||||||
|> map (Toplevel.modify_init (K thy));
|
|
||||||
val end_pos = if null trs then pos else Toplevel.pos_of (List.last trs);
|
|
||||||
val end_state = fold (Toplevel.command_exception true) trs Toplevel.toplevel;
|
|
||||||
in Toplevel.end_theory end_pos end_state end;
|
|
||||||
|
|
||||||
|
|
||||||
(* register theory *)
|
|
||||||
|
|
||||||
fun register_thy theory =
|
|
||||||
let
|
|
||||||
val name = Context.theory_long_name theory;
|
|
||||||
val {master, ...} = Resources.check_thy (Resources.master_directory theory) name;
|
|
||||||
val imports = Resources.imports_of theory;
|
|
||||||
in
|
|
||||||
change_thys (fn thys =>
|
|
||||||
let
|
|
||||||
val thys' = remove name thys;
|
|
||||||
val _ = writeln ("Registering theory " ^ quote name);
|
|
||||||
in update (make_deps master imports) theory thys' end)
|
|
||||||
end;
|
|
||||||
|
|
||||||
|
|
||||||
(* finish all theories *)
|
|
||||||
|
|
||||||
fun finish () = change_thys (String_Graph.map (fn _ => fn (_, entry) => (NONE, entry)));
|
|
||||||
|
|
||||||
end;
|
|
||||||
|
|
||||||
fun use_thy name = Runtime.toplevel_program (fn () => Thy_Info.use_thy name);
|
|
|
@ -1,484 +0,0 @@
|
||||||
(* Title: Pure/Thy/thy_info.ML
|
|
||||||
Author: Markus Wenzel, TU Muenchen
|
|
||||||
|
|
||||||
Global theory info database, with auto-loading according to theory and
|
|
||||||
file dependencies.
|
|
||||||
*)
|
|
||||||
|
|
||||||
signature THY_INFO =
|
|
||||||
sig
|
|
||||||
type presentation_context =
|
|
||||||
{options: Options.T, file_pos: Position.T, adjust_pos: Position.T -> Position.T,
|
|
||||||
segments: Thy_Output.segment list}
|
|
||||||
val apply_presentation: presentation_context -> theory -> unit
|
|
||||||
val add_presentation: (presentation_context -> theory -> unit) -> theory -> theory
|
|
||||||
val get_names: unit -> string list
|
|
||||||
val lookup_theory: string -> theory option
|
|
||||||
val get_theory: string -> theory
|
|
||||||
val master_directory: string -> Path.T
|
|
||||||
val remove_thy: string -> unit
|
|
||||||
type context =
|
|
||||||
{options: Options.T,
|
|
||||||
symbols: HTML.symbols,
|
|
||||||
bibtex_entries: string list,
|
|
||||||
last_timing: Toplevel.transition -> Time.time}
|
|
||||||
val use_theories: context -> string -> Path.T -> (string * Position.T) list -> unit
|
|
||||||
val use_thy: string -> unit
|
|
||||||
val script_thy: Position.T -> string -> theory -> theory
|
|
||||||
val register_thy: theory -> unit
|
|
||||||
val finish: unit -> unit
|
|
||||||
end;
|
|
||||||
|
|
||||||
structure Thy_Info: THY_INFO =
|
|
||||||
struct
|
|
||||||
|
|
||||||
(** presentation of consolidated theory **)
|
|
||||||
|
|
||||||
type presentation_context =
|
|
||||||
{options: Options.T, file_pos: Position.T, adjust_pos: Position.T -> Position.T,
|
|
||||||
segments: Thy_Output.segment list};
|
|
||||||
|
|
||||||
structure Presentation = Theory_Data
|
|
||||||
(
|
|
||||||
type T = ((presentation_context -> theory -> unit) * stamp) list;
|
|
||||||
val empty = [];
|
|
||||||
val extend = I;
|
|
||||||
fun merge data : T = Library.merge (eq_snd op =) data;
|
|
||||||
);
|
|
||||||
|
|
||||||
fun apply_presentation (context: presentation_context) thy =
|
|
||||||
ignore (Presentation.get thy |> Par_List.map (fn (f, _) => f context thy));
|
|
||||||
|
|
||||||
fun add_presentation f = Presentation.map (cons (f, stamp ()));
|
|
||||||
|
|
||||||
val _ =
|
|
||||||
Theory.setup (add_presentation (fn {options, file_pos, segments, ...} => fn thy =>
|
|
||||||
if exists (Toplevel.is_skipped_proof o #state) segments then ()
|
|
||||||
else
|
|
||||||
let
|
|
||||||
val body = Thy_Output.present_thy options thy segments;
|
|
||||||
val option = Present.document_option options;
|
|
||||||
in
|
|
||||||
if #disabled option then ()
|
|
||||||
else
|
|
||||||
let
|
|
||||||
val latex = Latex.isabelle_body (Context.theory_name thy) body;
|
|
||||||
val output = [Latex.output_text latex, Latex.output_positions file_pos latex];
|
|
||||||
val _ =
|
|
||||||
if Options.bool options "export_document"
|
|
||||||
then Export.export thy "document.tex" output else ();
|
|
||||||
val _ = if #enabled option then Present.theory_output thy output else ();
|
|
||||||
in () end
|
|
||||||
end));
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
(** thy database **)
|
|
||||||
|
|
||||||
(* messages *)
|
|
||||||
|
|
||||||
val show_path = space_implode " via " o map quote;
|
|
||||||
|
|
||||||
fun cycle_msg names = "Cyclic dependency of " ^ show_path names;
|
|
||||||
|
|
||||||
|
|
||||||
(* derived graph operations *)
|
|
||||||
|
|
||||||
fun add_deps name parents G = String_Graph.add_deps_acyclic (name, parents) G
|
|
||||||
handle String_Graph.CYCLES namess => error (cat_lines (map cycle_msg namess));
|
|
||||||
|
|
||||||
fun new_entry name parents entry =
|
|
||||||
String_Graph.new_node (name, entry) #> add_deps name parents;
|
|
||||||
|
|
||||||
|
|
||||||
(* global thys *)
|
|
||||||
|
|
||||||
type deps =
|
|
||||||
{master: (Path.T * SHA1.digest), (*master dependencies for thy file*)
|
|
||||||
imports: (string * Position.T) list}; (*source specification of imports (partially qualified)*)
|
|
||||||
|
|
||||||
fun make_deps master imports : deps = {master = master, imports = imports};
|
|
||||||
|
|
||||||
fun master_dir_deps (d: deps option) =
|
|
||||||
the_default Path.current (Option.map (Path.dir o #1 o #master) d);
|
|
||||||
|
|
||||||
local
|
|
||||||
val global_thys =
|
|
||||||
Synchronized.var "Thy_Info.thys"
|
|
||||||
(String_Graph.empty: (deps option * theory option) String_Graph.T);
|
|
||||||
in
|
|
||||||
fun get_thys () = Synchronized.value global_thys;
|
|
||||||
fun change_thys f = Synchronized.change global_thys f;
|
|
||||||
end;
|
|
||||||
|
|
||||||
fun get_names () = String_Graph.topological_order (get_thys ());
|
|
||||||
|
|
||||||
|
|
||||||
(* access thy *)
|
|
||||||
|
|
||||||
fun lookup thys name = try (String_Graph.get_node thys) name;
|
|
||||||
fun lookup_thy name = lookup (get_thys ()) name;
|
|
||||||
|
|
||||||
fun get thys name =
|
|
||||||
(case lookup thys name of
|
|
||||||
SOME thy => thy
|
|
||||||
| NONE => error ("Theory loader: nothing known about theory " ^ quote name));
|
|
||||||
|
|
||||||
fun get_thy name = get (get_thys ()) name;
|
|
||||||
|
|
||||||
|
|
||||||
(* access deps *)
|
|
||||||
|
|
||||||
val lookup_deps = Option.map #1 o lookup_thy;
|
|
||||||
|
|
||||||
val master_directory = master_dir_deps o #1 o get_thy;
|
|
||||||
|
|
||||||
|
|
||||||
(* access theory *)
|
|
||||||
|
|
||||||
fun lookup_theory name =
|
|
||||||
(case lookup_thy name of
|
|
||||||
SOME (_, SOME theory) => SOME theory
|
|
||||||
| _ => NONE);
|
|
||||||
|
|
||||||
fun get_theory name =
|
|
||||||
(case lookup_theory name of
|
|
||||||
SOME theory => theory
|
|
||||||
| _ => error ("Theory loader: undefined entry for theory " ^ quote name));
|
|
||||||
|
|
||||||
val get_imports = Resources.imports_of o get_theory;
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
(** thy operations **)
|
|
||||||
|
|
||||||
(* remove *)
|
|
||||||
|
|
||||||
fun remove name thys =
|
|
||||||
(case lookup thys name of
|
|
||||||
NONE => thys
|
|
||||||
| SOME (NONE, _) => error ("Cannot update finished theory " ^ quote name)
|
|
||||||
| SOME _ =>
|
|
||||||
let
|
|
||||||
val succs = String_Graph.all_succs thys [name];
|
|
||||||
val _ = writeln ("Theory loader: removing " ^ commas_quote succs);
|
|
||||||
in fold String_Graph.del_node succs thys end);
|
|
||||||
|
|
||||||
val remove_thy = change_thys o remove;
|
|
||||||
|
|
||||||
|
|
||||||
(* update *)
|
|
||||||
|
|
||||||
fun update deps theory thys =
|
|
||||||
let
|
|
||||||
val name = Context.theory_long_name theory;
|
|
||||||
val parents = map Context.theory_long_name (Theory.parents_of theory);
|
|
||||||
|
|
||||||
val thys' = remove name thys;
|
|
||||||
val _ = map (get thys') parents;
|
|
||||||
in new_entry name parents (SOME deps, SOME theory) thys' end;
|
|
||||||
|
|
||||||
fun update_thy deps theory = change_thys (update deps theory);
|
|
||||||
|
|
||||||
|
|
||||||
(* context *)
|
|
||||||
|
|
||||||
type context =
|
|
||||||
{options: Options.T,
|
|
||||||
symbols: HTML.symbols,
|
|
||||||
bibtex_entries: string list,
|
|
||||||
last_timing: Toplevel.transition -> Time.time};
|
|
||||||
|
|
||||||
fun default_context (): context =
|
|
||||||
{options = Options.default (),
|
|
||||||
symbols = HTML.no_symbols,
|
|
||||||
bibtex_entries = [],
|
|
||||||
last_timing = K Time.zeroTime};
|
|
||||||
|
|
||||||
|
|
||||||
(* scheduling loader tasks *)
|
|
||||||
|
|
||||||
datatype result =
|
|
||||||
Result of {theory: theory, exec_id: Document_ID.exec,
|
|
||||||
present: unit -> unit, commit: unit -> unit, weight: int};
|
|
||||||
|
|
||||||
fun theory_result theory =
|
|
||||||
Result {theory = theory, exec_id = Document_ID.none, present = I, commit = I, weight = 0};
|
|
||||||
|
|
||||||
fun result_theory (Result {theory, ...}) = theory;
|
|
||||||
fun result_present (Result {present, ...}) = present;
|
|
||||||
fun result_commit (Result {commit, ...}) = commit;
|
|
||||||
fun result_ord (Result {weight = i, ...}, Result {weight = j, ...}) = int_ord (j, i);
|
|
||||||
|
|
||||||
fun join_theory (Result {theory, exec_id, ...}) =
|
|
||||||
let
|
|
||||||
val _ = Execution.join [exec_id];
|
|
||||||
val res = Exn.capture Thm.consolidate_theory theory;
|
|
||||||
val exns = maps Task_Queue.group_status (Execution.peek exec_id);
|
|
||||||
in res :: map Exn.Exn exns end;
|
|
||||||
|
|
||||||
datatype task =
|
|
||||||
Task of string list * (theory list -> result) |
|
|
||||||
Finished of theory;
|
|
||||||
|
|
||||||
fun task_finished (Task _) = false
|
|
||||||
| task_finished (Finished _) = true;
|
|
||||||
|
|
||||||
fun task_parents deps (parents: string list) = map (the o AList.lookup (op =) deps) parents;
|
|
||||||
|
|
||||||
val schedule_seq =
|
|
||||||
String_Graph.schedule (fn deps => fn (_, task) =>
|
|
||||||
(case task of
|
|
||||||
Task (parents, body) =>
|
|
||||||
let
|
|
||||||
val result = body (task_parents deps parents);
|
|
||||||
val _ = Par_Exn.release_all (join_theory result);
|
|
||||||
val _ = result_present result ();
|
|
||||||
val _ = result_commit result ();
|
|
||||||
in result_theory result end
|
|
||||||
| Finished thy => thy)) #> ignore;
|
|
||||||
|
|
||||||
val schedule_futures = Thread_Attributes.uninterruptible (fn _ => fn tasks =>
|
|
||||||
let
|
|
||||||
val futures = tasks
|
|
||||||
|> String_Graph.schedule (fn deps => fn (name, task) =>
|
|
||||||
(case task of
|
|
||||||
Task (parents, body) =>
|
|
||||||
(singleton o Future.forks)
|
|
||||||
{name = "theory:" ^ name, group = NONE,
|
|
||||||
deps = map (Future.task_of o #2) deps, pri = 0, interrupts = true}
|
|
||||||
(fn () =>
|
|
||||||
(case filter (not o can Future.join o #2) deps of
|
|
||||||
[] => body (map (result_theory o Future.join) (task_parents deps parents))
|
|
||||||
| bad =>
|
|
||||||
error
|
|
||||||
("Failed to load theory " ^ quote name ^
|
|
||||||
" (unresolved " ^ commas_quote (map #1 bad) ^ ")")))
|
|
||||||
| Finished theory => Future.value (theory_result theory)));
|
|
||||||
|
|
||||||
val results1 = futures
|
|
||||||
|> maps (fn future =>
|
|
||||||
(case Future.join_result future of
|
|
||||||
Exn.Res result => join_theory result
|
|
||||||
| Exn.Exn exn => [Exn.Exn exn]));
|
|
||||||
|
|
||||||
val results2 = futures
|
|
||||||
|> map_filter (Exn.get_res o Future.join_result)
|
|
||||||
|> sort result_ord
|
|
||||||
|> Par_List.map (fn result => Exn.capture (result_present result) ());
|
|
||||||
|
|
||||||
(* FIXME more precise commit order (!?) *)
|
|
||||||
val results3 = futures
|
|
||||||
|> map (fn future => Exn.capture (fn () => result_commit (Future.join future) ()) ());
|
|
||||||
|
|
||||||
(* FIXME avoid global Execution.reset (!??) *)
|
|
||||||
val results4 = map Exn.Exn (maps Task_Queue.group_status (Execution.reset ()));
|
|
||||||
|
|
||||||
val _ = Par_Exn.release_all (results1 @ results2 @ results3 @ results4);
|
|
||||||
in () end);


(* eval theory *)

fun excursion keywords master_dir last_timing init elements =
  let
    fun prepare_span st span =
      Command_Span.content span
      |> Command.read keywords (Command.read_thy st) master_dir init ([], ~1)
      |> (fn tr => Toplevel.put_timing (last_timing tr) tr);

    fun element_result span_elem (st, _) =
      let
        val elem = Thy_Syntax.map_element (prepare_span st) span_elem;
        val (results, st') = Toplevel.element_result keywords elem st;
        val pos' = Toplevel.pos_of (Thy_Syntax.last_element elem);
      in (results, (st', pos')) end;

    val (results, (end_state, end_pos)) =
      fold_map element_result elements (Toplevel.toplevel, Position.none);

    val thy = Toplevel.end_theory end_pos end_state;
  in (results, thy) end;

fun eval_thy (context: context) update_time master_dir header text_pos text parents =
  let
    val {options, symbols, bibtex_entries, last_timing} = context;
    val (name, _) = #name header;
    val keywords =
      fold (curry Keyword.merge_keywords o Thy_Header.get_keywords) parents
        (Keyword.add_keywords (#keywords header) Keyword.empty_keywords);

    val spans = Outer_Syntax.parse_spans (Token.explode keywords text_pos text);
    val elements = Thy_Syntax.parse_elements keywords spans;

    fun init () =
      Resources.begin_theory master_dir header parents
      |> Present.begin_theory bibtex_entries update_time
        (fn () => implode (map (HTML.present_span symbols keywords) spans));

    val (results, thy) =
      cond_timeit true ("theory " ^ quote name)
        (fn () => excursion keywords master_dir last_timing init elements);

    fun present () =
      let
        val segments = (spans ~~ maps Toplevel.join_results results)
          |> map (fn (span, (tr, st')) => {span = span, command = tr, state = st'});
        val context: presentation_context =
          {options = options, file_pos = text_pos, adjust_pos = I, segments = segments};
      in apply_presentation context thy end;
  in (thy, present, size text) end;
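
(* Aside (illustrative sketch, not part of the patched file): cond_timeit, used in
   eval_thy above, runs the given function and, when its first argument is true and
   the run is long enough to be reported, prints a timing line tagged with the given
   message.  A minimal example, assuming an Isabelle/ML toplevel:

     val n = cond_timeit true "sum" (fn () => fold (fn i => fn s => i + s) (1 upto 1000) 0);
     (* n evaluates to 500500; a timing line appears only for sufficiently long runs *)
*)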


(* require_thy -- checking database entries wrt. the file-system *)

local

fun required_by _ [] = ""
  | required_by s initiators = s ^ "(required by " ^ show_path (rev initiators) ^ ")";

fun load_thy context initiators update_time deps text (name, pos) keywords parents =
  let
    val _ = remove_thy name;
    val _ = writeln ("Loading theory " ^ quote name ^ required_by " " initiators);
    val _ = Output.try_protocol_message (Markup.loading_theory name) [];

    val {master = (thy_path, _), imports} = deps;
    val dir = Path.dir thy_path;
    val header = Thy_Header.make (name, pos) imports keywords;

    val _ = Position.reports (map #2 imports ~~ map Theory.get_markup parents);

    val exec_id = Document_ID.make ();
    val _ =
      Execution.running Document_ID.none exec_id [] orelse
        raise Fail ("Failed to register execution: " ^ Document_ID.print exec_id);

    val timing_start = Timing.start ();

    val text_pos = Position.put_id (Document_ID.print exec_id) (Path.position thy_path);
    val (theory, present, weight) =
      eval_thy context update_time dir header text_pos text
        (if name = Context.PureN then [Context.the_global_context ()] else parents);

    val timing_result = Timing.result timing_start;
    val timing_props = [Markup.theory_timing, (Markup.nameN, name)];
    val _ = Output.try_protocol_message (timing_props @ Markup.timing_properties timing_result) []

    fun commit () = update_thy deps theory;
  in
    Result {theory = theory, exec_id = exec_id, present = present, commit = commit, weight = weight}
  end;

fun check_deps dir name =
  (case lookup_deps name of
    SOME NONE => (true, NONE, Position.none, get_imports name, [])
  | NONE =>
      let val {master, text, theory_pos, imports, keywords} = Resources.check_thy dir name
      in (false, SOME (make_deps master imports, text), theory_pos, imports, keywords) end
  | SOME (SOME {master, ...}) =>
      let
        val {master = master', text = text', theory_pos = theory_pos', imports = imports',
          keywords = keywords'} = Resources.check_thy dir name;
        val deps' = SOME (make_deps master' imports', text');
        val current =
          #2 master = #2 master' andalso
            (case lookup_theory name of
              NONE => false
            | SOME theory => Resources.loaded_files_current theory);
      in (current, deps', theory_pos', imports', keywords') end);

in

fun require_thys context initiators qualifier dir strs tasks =
  fold_map (require_thy context initiators qualifier dir) strs tasks |>> forall I
and require_thy context initiators qualifier dir (s, require_pos) tasks =
  let
    val {master_dir, theory_name, ...} = Resources.import_name qualifier dir s;
  in
    (case try (String_Graph.get_node tasks) theory_name of
      SOME task => (task_finished task, tasks)
    | NONE =>
        let
          val _ = member (op =) initiators theory_name andalso error (cycle_msg initiators);

          val (current, deps, theory_pos, imports, keywords) = check_deps master_dir theory_name
            handle ERROR msg =>
              cat_error msg
                ("The error(s) above occurred for theory " ^ quote theory_name ^
                  Position.here require_pos ^ required_by "\n" initiators);

          val qualifier' = Resources.theory_qualifier theory_name;
          val dir' = Path.append dir (master_dir_deps (Option.map #1 deps));

          val parents = map (#theory_name o Resources.import_name qualifier' dir' o #1) imports;
          val (parents_current, tasks') =
            require_thys context (theory_name :: initiators) qualifier' dir' imports tasks;

          val all_current = current andalso parents_current;
          val task =
            if all_current then Finished (get_theory theory_name)
            else
              (case deps of
                NONE => raise Fail "Malformed deps"
              | SOME (dep, text) =>
                  let
                    val update_time = serial ();
                    val load =
                      load_thy context initiators update_time
                        dep text (theory_name, theory_pos) keywords;
                  in Task (parents, load) end);

          val tasks'' = new_entry theory_name parents task tasks';
        in (all_current, tasks'') end)
  end;

end;

(* use theories *)

fun use_theories context qualifier master_dir imports =
  let val (_, tasks) = require_thys context [] qualifier master_dir imports String_Graph.empty
  in if Multithreading.max_threads () > 1 then schedule_futures tasks else schedule_seq tasks end;

fun use_thy name =
  use_theories (default_context ()) Resources.default_qualifier
    Path.current [(name, Position.none)];
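
(* Aside (illustrative sketch, not part of the patched file): once this structure is in
   scope, use_thy is the entry point for batch-loading a single theory from user ML code,
   e.g.

     Thy_Info.use_thy "HOL.Main";

   (the session-qualified name is an assumption for illustration).  It resolves the name
   relative to Path.current with Resources.default_qualifier, schedules any missing
   ancestors and commits the results to the internal theory database. *)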


(* toplevel scripting -- without maintaining database *)

fun script_thy pos txt thy =
  let
    val trs =
      Outer_Syntax.parse thy pos txt
      |> map (Toplevel.modify_init (K thy));
    val end_pos = if null trs then pos else Toplevel.pos_of (List.last trs);
    val end_state = fold (Toplevel.command_exception true) trs Toplevel.toplevel;
  in Toplevel.end_theory end_pos end_state end;
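
(* Aside (illustrative sketch, not part of the patched file): script_thy runs a snippet of
   outer syntax against an existing theory without touching the theory database.  A
   hypothetical use, assuming theory "HOL.Main" has already been loaded:

     val thy' =
       Thy_Info.script_thy Position.none
         "lemma trivial: \"True\" by simp" (Thy_Info.get_theory "HOL.Main");
*)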


(* register theory *)

fun register_thy theory =
  let
    val name = Context.theory_long_name theory;
    val {master, ...} = Resources.check_thy (Resources.master_directory theory) name;
    val imports = Resources.imports_of theory;
  in
    change_thys (fn thys =>
      let
        val thys' = remove name thys;
        val _ = writeln ("Registering theory " ^ quote name);
      in update (make_deps master imports) theory thys' end)
  end;


(* finish all theories *)

fun finish () = change_thys (String_Graph.map (fn _ => fn (_, entry) => (NONE, entry)));

end;

fun use_thy name = Runtime.toplevel_program (fn () => Thy_Info.use_thy name);
@@ -175,30 +175,30 @@ end;

 (* presentation tokens *)

 datatype token =
-    Ignore_Token
+    Ignore
-  | Basic_Token of Token.T
+  | Token of Token.T
-  | Markup_Token of string * Input.source
+  | Heading of string * Input.source
-  | Markup_Env_Token of string * Input.source
+  | Body of string * Input.source
-  | Raw_Token of Input.source;
+  | Raw of Input.source;

-fun basic_token pred (Basic_Token tok) = pred tok
+fun token_with pred (Token tok) = pred tok
-  | basic_token _ _ = false;
+  | token_with _ _ = false;

-val white_token = basic_token Document_Source.is_white;
+val white_token = token_with Document_Source.is_white;
-val white_comment_token = basic_token Document_Source.is_white_comment;
+val white_comment_token = token_with Document_Source.is_white_comment;
-val blank_token = basic_token Token.is_blank;
+val blank_token = token_with Token.is_blank;
-val newline_token = basic_token Token.is_newline;
+val newline_token = token_with Token.is_newline;

 fun present_token ctxt tok =
   (case tok of
-    Ignore_Token => []
+    Ignore => []
-  | Basic_Token tok => output_token ctxt tok
+  | Token tok => output_token ctxt tok
-  | Markup_Token (cmd, source) =>
+  | Heading (cmd, source) =>
       Latex.enclose_body ("%\n\\isamarkup" ^ cmd ^ "{") "%\n}\n"
         (output_document ctxt {markdown = false} source)
-  | Markup_Env_Token (cmd, source) =>
+  | Body (cmd, source) =>
       [Latex.environment_block ("isamarkup" ^ cmd) (output_document ctxt {markdown = true} source)]
-  | Raw_Token source =>
+  | Raw source =>
       Latex.string "%\n" :: output_document ctxt {markdown = true} source @ [Latex.string "\n"]);


@@ -367,7 +367,7 @@ fun present_thy options thy (segments: segment list) =

 (* tokens *)

 val ignored = Scan.state --| ignore
-  >> (fn d => (NONE, (Ignore_Token, ("", d))));
+  >> (fn d => (NONE, (Ignore, ("", d))));

 fun markup pred mk flag = Scan.peek (fn d =>
   Document_Source.improper |--

@@ -384,29 +384,29 @@ fun present_thy options thy (segments: segment list) =
   Scan.optional (Scan.one Token.is_command_modifier ::: Document_Source.improper) [] --
   Scan.one Token.is_command --| Document_Source.annotation
   >> (fn (cmd_mod, cmd) =>
-    map (fn tok => (NONE, (Basic_Token tok, ("", d)))) cmd_mod @
+    map (fn tok => (NONE, (Token tok, ("", d)))) cmd_mod @
     [(SOME (Token.content_of cmd, Token.pos_of cmd),
-      (Basic_Token cmd, (markup_false, d)))]));
+      (Token cmd, (markup_false, d)))]));

 val cmt = Scan.peek (fn d =>
-  Scan.one Document_Source.is_black_comment >> (fn tok => (NONE, (Basic_Token tok, ("", d)))));
+  Scan.one Document_Source.is_black_comment >> (fn tok => (NONE, (Token tok, ("", d)))));

 val other = Scan.peek (fn d =>
-  Parse.not_eof >> (fn tok => (NONE, (Basic_Token tok, ("", d)))));
+  Parse.not_eof >> (fn tok => (NONE, (Token tok, ("", d)))));

 val tokens =
   (ignored ||
-   markup Keyword.is_document_heading Markup_Token markup_true ||
+   markup Keyword.is_document_heading Heading markup_true ||
-   markup Keyword.is_document_body Markup_Env_Token markup_true ||
+   markup Keyword.is_document_body Body markup_true ||
-   markup Keyword.is_document_raw (Raw_Token o #2) "") >> single ||
+   markup Keyword.is_document_raw (Raw o #2) "") >> single ||
   command ||
   (cmt || other) >> single;


 (* spans *)

-val is_eof = fn (_, (Basic_Token x, _)) => Token.is_eof x | _ => false;
+val is_eof = fn (_, (Token x, _)) => Token.is_eof x | _ => false;
-val stopper = Scan.stopper (K (NONE, (Basic_Token Token.eof, ("", 0)))) is_eof;
+val stopper = Scan.stopper (K (NONE, (Token Token.eof, ("", 0)))) is_eof;

 val cmd = Scan.one (is_some o fst);
 val non_cmd = Scan.one (is_none o fst andf not o is_eof) >> #2;

@@ -466,7 +466,7 @@ end;

 (* pretty printing *)

 fun pretty_term ctxt t =
-  Syntax.pretty_term (Variable.auto_fixes t ctxt) t;
+  Syntax.pretty_term (Proof_Context.augment t ctxt) t;

 fun pretty_thm ctxt = pretty_term ctxt o Thm.full_prop_of;
@@ -2,6 +2,8 @@
 Author: Makarius

 Theory document output.

+This is a modified/patched version that supports Isabelle/DOF.
 *)

 signature THY_OUTPUT =

@@ -176,32 +178,32 @@ end;

 (* presentation tokens *)

 datatype token =
-    Ignore_Token
+    Ignore
-  | Basic_Token of Token.T
+  | Token of Token.T
-  | Markup_Token of string * string * Input.source
+  | Heading of string * string * Input.source
-  | Markup_Env_Token of string * string * Input.source
+  | Body of string * string * Input.source
-  | Raw_Token of Input.source;
+  | Raw of Input.source;

-fun basic_token pred (Basic_Token tok) = pred tok
+fun token_with pred (Token tok) = pred tok
-  | basic_token _ _ = false;
+  | token_with _ _ = false;

-val white_token = basic_token Document_Source.is_white;
+val white_token = token_with Document_Source.is_white;
-val white_comment_token = basic_token Document_Source.is_white_comment;
+val white_comment_token = token_with Document_Source.is_white_comment;
-val blank_token = basic_token Token.is_blank;
+val blank_token = token_with Token.is_blank;
-val newline_token = basic_token Token.is_newline;
+val newline_token = token_with Token.is_newline;

 fun present_token ctxt tok =
   (case tok of
-    Ignore_Token => []
+    Ignore => []
-  | Basic_Token tok => output_token ctxt tok
+  | Token tok => output_token ctxt tok
-  | Markup_Token (cmd, meta_args, source) =>
+  | Heading (cmd, meta_args, source) =>
       Latex.enclose_body ("%\n\\isamarkup" ^ cmd (* ^ meta_args *) ^ "{") "%\n}\n"
         (output_document ctxt {markdown = false} source)
-  | Markup_Env_Token (cmd, meta_args, source) =>
+  | Body (cmd, meta_args, source) =>
       [Latex.environment_block
         ("isamarkup" ^ cmd (* ^ meta_args*))
         (Latex.string meta_args :: output_document ctxt {markdown = true} source)]
-  | Raw_Token source =>
+  | Raw source =>
       Latex.string "%\n" :: output_document ctxt {markdown = true} source @ [Latex.string "\n"]);


@@ -376,7 +378,7 @@ fun present_thy options thy (segments: segment list) =

 (* tokens *)

 val ignored = Scan.state --| ignore
-  >> (fn d => (NONE, (Ignore_Token, ("", d))));
+  >> (fn d => (NONE, (Ignore, ("", d))));

 fun markup pred mk flag = Scan.peek (fn d =>
   Document_Source.improper

@@ -399,29 +401,29 @@ fun present_thy options thy (segments: segment list) =
   Scan.optional (Scan.one Token.is_command_modifier ::: Document_Source.improper) [] --
   Scan.one Token.is_command --| Document_Source.annotation
   >> (fn (cmd_mod, cmd) =>
-    map (fn tok => (NONE, (Basic_Token tok, ("", d)))) cmd_mod @
+    map (fn tok => (NONE, (Token tok, ("", d)))) cmd_mod @
     [(SOME (Token.content_of cmd, Token.pos_of cmd),
-      (Basic_Token cmd, (markup_false, d)))]));
+      (Token cmd, (markup_false, d)))]));

 val cmt = Scan.peek (fn d =>
-  Scan.one Document_Source.is_black_comment >> (fn tok => (NONE, (Basic_Token tok, ("", d)))));
+  Scan.one Document_Source.is_black_comment >> (fn tok => (NONE, (Token tok, ("", d)))));

 val other = Scan.peek (fn d =>
-  Parse.not_eof >> (fn tok => (NONE, (Basic_Token tok, ("", d)))));
+  Parse.not_eof >> (fn tok => (NONE, (Token tok, ("", d)))));

 val tokens =
   (ignored ||
-   markup Keyword.is_document_heading Markup_Token markup_true ||
+   markup Keyword.is_document_heading Heading markup_true ||
-   markup Keyword.is_document_body Markup_Env_Token markup_true ||
+   markup Keyword.is_document_body Body markup_true ||
-   markup Keyword.is_document_raw (Raw_Token o #3) "") >> single ||
+   markup Keyword.is_document_raw (Raw o #3) "") >> single ||
   command ||
   (cmt || other) >> single;


 (* spans *)

-val is_eof = fn (_, (Basic_Token x, _)) => Token.is_eof x | _ => false;
+val is_eof = fn (_, (Token x, _)) => Token.is_eof x | _ => false;
-val stopper = Scan.stopper (K (NONE, (Basic_Token Token.eof, ("", 0)))) is_eof;
+val stopper = Scan.stopper (K (NONE, (Token Token.eof, ("", 0)))) is_eof;

 val cmd = Scan.one (is_some o fst);
 val non_cmd = Scan.one (is_none o fst andf not o is_eof) >> #2;

@@ -488,7 +490,7 @@ end;

 (* pretty printing *)

 fun pretty_term ctxt t =
-  Syntax.pretty_term (Variable.auto_fixes t ctxt) t;
+  Syntax.pretty_term (Proof_Context.augment t ctxt) t;

 fun pretty_thm ctxt = pretty_term ctxt o Thm.full_prop_of;
@@ -14,8 +14,8 @@

 theory
   AssnsLemmaThmEtc
 imports
-  "../ontologies/Conceptual/Conceptual"
+  "Isabelle_DOF.Conceptual"
-  "../ontologies/math_paper/math_paper"
+  "Isabelle_DOF.math_paper"
 begin

 section\<open>Elementary Creation of Doc-items and Access of their Attibutes\<close>

@@ -14,7 +14,7 @@

 theory
   Attributes
 imports
-  "../ontologies/Conceptual/Conceptual"
+  "Isabelle_DOF.Conceptual"
 begin

 section\<open>Elementary Creation of Doc-items and Access of their Attibutes\<close>

@@ -16,10 +16,10 @@ chapter\<open>Setting and modifying attributes of doc-items\<close>

 theory
   Concept_Example
 imports
-  "../ontologies/Conceptual/Conceptual" (* we use the generic "Conceptual" ontology *)
+  "Isabelle_DOF.Conceptual" (* we use the generic "Conceptual" ontology *)
 begin

-text\<open>@{theory "Isabelle_DOF-tests.Conceptual"} provides a monitor @{typ M} enforcing a particular document structure.
+text\<open>@{theory "Isabelle_DOF.Conceptual"} provides a monitor @{typ M} enforcing a particular document structure.
 Here, we say: From now on, this structural rules are respected wrt. all doc\_classes M is
 enabled for.\<close>
 open_monitor*[struct::M]

@@ -16,7 +16,7 @@ chapter\<open>Setting and modifying attributes of doc-items\<close>

 theory
   Concept_ExampleInvariant
 imports
-  "../ontologies/Conceptual/Conceptual" (* we use the generic "Conceptual" ontology *)
+  "Isabelle_DOF.Conceptual" (* we use the generic "Conceptual" ontology *)
 begin

 section\<open>Example: Standard Class Invariant\<close>

@@ -16,7 +16,7 @@ chapter\<open>Inner Syntax Antiquotations (ISA)'s\<close>

 theory
   InnerSyntaxAntiquotations
 imports
-  "../ontologies/Conceptual/Conceptual"
+  "Isabelle_DOF.Conceptual"
 begin

 text\<open>Since the syntax chosen for values of doc-class attributes is HOL-syntax --- requiring
@@ -0,0 +1,8 @@
+session "Isabelle_DOF-tests" = "Isabelle_DOF" +
+  options [document = false]
+  theories
+    "AssnsLemmaThmEtc"
+    "Concept_ExampleInvariant"
+    "Concept_Example"
+    "InnerSyntaxAntiquotations"
+    "Attributes"