Merge remote-tracking branch 'origin/Unreleased/Isabelle2020-RC4'

This commit is contained in:
Achim D. Brucker 2020-04-21 08:37:59 +01:00
commit c320b58dd3
23 changed files with 128 additions and 1080 deletions

View File

@@ -6,9 +6,9 @@ DOF_LATEST_DOI="10.5281/zenodo.3370483"
DOF_GENERIC_DOI="10.5281/zenodo.3370482"
#
# Isabelle and AFP Configuration
ISABELLE_VERSION="Isabelle2019: June 2019"
ISABELLE_URL="https://isabelle.in.tum.de/website-Isabelle2019/"
AFP_DATE="afp-2019-06-17"
ISABELLE_VERSION="Isabelle2020: April 2020"
ISABELLE_URL="https://isabelle.in.tum.de/website-Isabelle2020/"
AFP_DATE="afp-2020-04-20"
AFP_URL="https://www.isa-afp.org/release/"$AFP_DATE".tar.gz"
#
# Isabelle/DOF Repository Configuration

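The hunk above only bumps the version variables; the download URL is derived from `AFP_DATE` by plain string concatenation. A short shell snippet can sanity-check that wiring (values copied from the post-merge state; the `echo` is purely illustrative and not part of the repository):

```shell
# Reconstruct AFP_URL exactly as the config file above does, using the
# post-merge values; illustrative check only.
AFP_DATE="afp-2020-04-20"
AFP_URL="https://www.isa-afp.org/release/"$AFP_DATE".tar.gz"
echo "$AFP_URL"   # prints: https://www.isa-afp.org/release/afp-2020-04-20.tar.gz
```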
View File

@@ -20,9 +20,9 @@ foo@bar:~$ docker run -ti --rm -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-u
Isabelle/DOF has two major pre-requisites:
* **Isabelle:** Isabelle/DOF requires [Isabelle 2019](http://isabelle.in.tum.de/website-Isabelle2019/).
Please download the Isabelle 2019 distribution for your operating
system from the [Isabelle website](http://isabelle.in.tum.de/website-Isabelle2019/).
* **Isabelle:** Isabelle/DOF requires [Isabelle 2020](http://isabelle.in.tum.de/website-Isabelle2020/).
Please download the Isabelle 2020 distribution for your operating
system from the [Isabelle website](http://isabelle.in.tum.de/website-Isabelle2020/).
* **LaTeX:** Isabelle/DOF requires a modern pdfTeX-engine supporting the \expanded{}-primitive. This
is, for example, included in the [TeXLive 2019](https://www.tug.org/texlive/) (or later)
distribution. Please follow the [TeXLive installation instructions](https://www.tug.org/texlive/acquire-netinstall.html)
@@ -41,7 +41,7 @@ one), the full path to the ``isabelle`` command needs to be passed as
using the ``--isabelle`` command line argument of the ``install`` script:
```console
foo@bar:~$ ./install --isabelle /usr/local/Isabelle2019/bin/isabelle
foo@bar:~$ ./install --isabelle /usr/local/Isabelle2020/bin/isabelle
```
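Before pointing the installer at a non-standard location, the release name encoded in such a path can be extracted and checked with a small shell sketch (the binary path below is illustrative, mirroring the example above):

```shell
# Hypothetical pre-flight check: derive the release directory name from the
# binary path before passing it via --isabelle (path is illustrative).
ISABELLE_BIN="/usr/local/Isabelle2020/bin/isabelle"
RELEASE=$(basename "$(dirname "$(dirname "$ISABELLE_BIN")")")
echo "$RELEASE"   # prints: Isabelle2020
```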
For further command line options of the installer, please use the

View File

@@ -51,7 +51,7 @@ subsubsection*[prerequisites::technical]\<open>Pre-requisites\<close>
text\<open>
\isadof has two major pre-requisites:
\<^item> \<^bold>\<open>Isabelle\<close>\bindex{Isabelle} (\isabellefullversion).
\isadof uses a two-part version system (e.g., 1.0.0/2019),
\isadof uses a two-part version system (e.g., 1.0.0/2020),
where the first part is the version of \isadof (using semantic versioning) and the second
part is the supported version of Isabelle. Thus, the same version of \isadof might be
available for different versions of Isabelle.
@@ -266,7 +266,7 @@ users are:
\<^item> The file \inlinebash|ROOT|\index{ROOT}, which defines the Isabelle session. New theory files as well as new
files required by the document generation (\eg, images, bibliography database using \BibTeX, local
\LaTeX-styles) need to be registered in this file. For details of Isabelle's build system, please
consult the Isabelle System Manual~@{cite "wenzel:system-manual:2019"}.
consult the Isabelle System Manual~@{cite "wenzel:system-manual:2020"}.
\<^item> The file \inlinebash|preamble.tex|\index{preamble.tex}, which allows users to add additional
\LaTeX-packages or to add/modify \LaTeX-commands.
\<close>
@@ -552,7 +552,7 @@ figure*[figfig3::figure,relative_width="95",src="''figures/antiquotations-PIDE''
\<open> Standard antiquotations referring to theory elements.\<close>
text\<open> The corresponding view in @{docitem \<open>figfig3\<close>} shows the core part of a document
conforming to the CENELEC 50128 ontology. The first sample shows standard Isabelle antiquotations
@{cite "wenzel:isabelle-isar:2019"} into formal entities of a theory. This way, the informal parts
@{cite "wenzel:isabelle-isar:2020"} into formal entities of a theory. This way, the informal parts
of a document get ``formal content'' and become more robust under change.\<close>
figure*[figfig5::figure, relative_width="95", src="''figures/srac-definition''"]

View File

@@ -220,22 +220,22 @@ text\<open>
As ODL is an extension of Isabelle/HOL, document class definitions can therefore be arbitrarily
mixed with standard HOL specification constructs. To make this manual self-contained, we present
syntax and semantics of the specification constructs that are most likely relevant for the
developer of ontologies (for more details, see~@{cite "wenzel:isabelle-isar:2019"}). Our
developer of ontologies (for more details, see~@{cite "wenzel:isabelle-isar:2020"}). Our
presentation is a simplification of the original sources following the needs of ontology developers
in \isadof:
\<^item> \<open>name\<close>:\index{name@\<open>name\<close>}
with the syntactic category of \<open>name\<close>'s we refer to alpha-numerical identifiers
(called \<open>short_id\<close>'s in @{cite "wenzel:isabelle-isar:2019"}) and identifiers
(called \<open>short_id\<close>'s in @{cite "wenzel:isabelle-isar:2020"}) and identifiers
in \inlineisar+" ... "+ which might contain certain ``quasi-letters'' such
as \inlineisar+_+, \inlineisar+-+, \inlineisar+.+ (see~@{cite "wenzel:isabelle-isar:2019"} for
as \inlineisar+_+, \inlineisar+-+, \inlineisar+.+ (see~@{cite "wenzel:isabelle-isar:2020"} for
details).
\<^item> \<open>tyargs\<close>:\index{tyargs@\<open>tyargs\<close>}
\<^rail>\<open> typefree | ('(' (typefree * ',') ')')\<close>
\<open>typefree\<close> denotes fixed type variables (\<open>'a\<close>, \<open>'b\<close>, ...) (see~@{cite "wenzel:isabelle-isar:2019"})
\<open>typefree\<close> denotes fixed type variables (\<open>'a\<close>, \<open>'b\<close>, ...) (see~@{cite "wenzel:isabelle-isar:2020"})
\<^item> \<open>dt_name\<close>:\index{dt\_name@\<open>dt_name\<close>}
\<^rail>\<open> (tyargs?) name (mixfix?)\<close>
The syntactic entity \<open>name\<close> denotes an identifier, \<open>mixfix\<close> denotes the usual
parenthesized mixfix notation (see @{cite "wenzel:isabelle-isar:2019"}).
parenthesized mixfix notation (see @{cite "wenzel:isabelle-isar:2020"}).
The \<open>name\<close>'s referred here are type names such as \<^verbatim>\<open>int\<close>, \<^verbatim>\<open>string\<close>, \<^verbatim>\<open>list\<close>, \<^verbatim>\<open>set\<close>, etc.
\<^item> \<open>type_spec\<close>:\index{type_spec@\<open>type_spec\<close>}
\<^rail>\<open> (tyargs?) name\<close>
@@ -256,13 +256,13 @@ text\<open>
mathematical notations for $\lambda$-terms in Isabelle/HOL. Example expressions are:
\inlineisar|1+2| (arithmetics), \inlineisar|[1,2,3]| (lists), \inlineisar|''ab c''| (strings),
\inlineisar|{1,2,3}| (sets), \inlineisar|(1,2,3)| (tuples),
\inlineisar|\<forall> x. P(x) \<and> Q x = C| (formulas). For details, see~@{cite "nipkow:whats:2019"}.
\inlineisar|\<forall> x. P(x) \<and> Q x = C| (formulas). For details, see~@{cite "nipkow:whats:2020"}.
\<close>
text\<open>
Advanced ontologies can, \eg, use recursive function definitions with
pattern-matching~@{cite "kraus:defining:2019"}, extensible record
specifications~@{cite "wenzel:isabelle-isar:2019"}, and abstract type declarations.
pattern-matching~@{cite "kraus:defining:2020"}, extensible record
specifications~@{cite "wenzel:isabelle-isar:2020"}, and abstract type declarations.
\<close>
text\<open>Note that \isadof works internally with fully qualified names in order to avoid
@@ -554,7 +554,7 @@ subsubsection\<open>Experts: Defining New Top-Level Commands\<close>
text\<open>
Defining such new top-level commands requires some Isabelle knowledge as well as
extending the dispatcher of the \LaTeX-backend. For the details of defining top-level
commands, we refer the reader to the Isar manual~@{cite "wenzel:isabelle-isar:2019"}.
commands, we refer the reader to the Isar manual~@{cite "wenzel:isabelle-isar:2020"}.
Here, we only give a brief example how the \inlineisar|section*|-command is defined; we
refer the reader to the source code of \isadof for details.

View File

@@ -33,7 +33,7 @@ text\<open>
\<close>
text\<open>
Semantic macros, as required by our document model, are called \<^emph>\<open>document antiquotations\<close>
in the Isabelle literature~@{cite "wenzel:isabelle-isar:2019"}. While Isabelle's code-antiquotations
in the Isabelle literature~@{cite "wenzel:isabelle-isar:2020"}. While Isabelle's code-antiquotations
are an old concept going back to Lisp and having found via SML and OCaml their ways into modern
proof systems, special annotation syntax inside documentation comments have their roots in
documentation generators such as Javadoc. Their use, however, as a mechanism to embed

View File

@@ -17,10 +17,10 @@
year = 2019
}
@Manual{ wenzel:isabelle-isar:2019,
@Manual{ wenzel:isabelle-isar:2020,
title = {The Isabelle/Isar Reference Manual},
author = {Makarius Wenzel},
year = 2019,
year = 2020,
note = {Part of the Isabelle distribution.}
}
@@ -41,7 +41,7 @@
ontological feedback during the editing of a document.
In this paper, we give an in-depth presentation of the
design concepts of DOF{\^a}s Ontology Definition Language
design concepts of DOF's Ontology Definition Language
(ODL) and key aspects of the technology of its
implementation. Isabelle/DOF is the first ontology language
supporting machine-checked links between the formal and
@@ -321,18 +321,18 @@
year = 2019
}
@Misc{ kraus:defining:2019,
@Misc{ kraus:defining:2020,
title = {Defining Recursive Functions in Isabelle/HOL},
author = {Alexander Kraus},
note = {\url{https://isabelle.in.tum.de/doc/functions.pdf}},
year = 2019
year = 2020
}
@Misc{ nipkow:whats:2019,
@Misc{ nipkow:whats:2020,
title = {What's in Main},
author = {Tobias Nipkow},
note = {\url{https://isabelle.in.tum.de/doc/main.pdf}},
year = 2019
year = 2020
}
@InProceedings{ wenzel:system:2014,
@@ -455,10 +455,10 @@
year = 2019
}
@Booklet{ wenzel:system-manual:2019,
@Booklet{ wenzel:system-manual:2020,
author = {Makarius Wenzel},
title = {The {Isabelle} System Manual},
year = 2019,
year = 2020,
note = {Part of the Isabelle distribution.}
}

View File

@@ -14,7 +14,6 @@
(*<*)
theory TR_MyCommentedIsabelle
imports "Isabelle_DOF.technical_report"
(*imports "../../../ontologies/technical_report"*)
begin
@@ -22,7 +21,7 @@ open_monitor*[this::report]
(*>*)
title*[tit::title]\<open>My Personal, Eclectic Isabelle Programming Manual\<close>
subtitle*[stit::subtitle]\<open>Version : Isabelle 2019\<close>
subtitle*[stit::subtitle]\<open>Version : Isabelle 2020\<close>
text*[bu::author,
email = "''wolff@lri.fr''",
affiliation = "\<open>Université Paris-Saclay, LRI, France\<close>"]\<open>Burkhart Wolff\<close>
@@ -784,9 +783,9 @@ Theory.setup: (theory -> theory) -> unit; (* The thing to extend the table of "
Theory.get_markup: theory -> Markup.T;
Theory.axiom_table: theory -> term Name_Space.table;
Theory.axiom_space: theory -> Name_Space.T;
Theory.axioms_of: theory -> (string * term) list;
Theory.all_axioms_of: theory -> (string * term) list;
Theory.defs_of: theory -> Defs.T;
Theory.join_theory: theory list -> theory;
Theory.at_begin: (theory -> theory option) -> theory -> theory;
Theory.at_end: (theory -> theory option) -> theory -> theory;
Theory.begin_theory: string * Position.T -> theory list -> theory;
@@ -1016,7 +1015,7 @@ fun derive_thm name term lthy =
[] (* local assumption context *)
(term) (* parsed goal *)
(fn _ => simp_tac lthy 1) (* proof tactic *)
|> Thm.close_derivation (* some cleanups *);
|> Thm.close_derivation \<^here> (* some cleanups *);
val thm111_intern = derive_thm "thm111" tt @{context} (* just for fun at the ML level *)
@@ -1035,7 +1034,7 @@ text\<open>Converting a local theory transformation into a global one:\<close>
setup\<open>SimpleSampleProof.prove_n_store\<close>
text\<open>... and there it is in the global (Isar) context:\<close>
thm thm111
thm "thm111"
@@ -2038,7 +2037,7 @@ text\<open>The heart of the LaTeX generator is to be found in the structure @{ML
\<close>
ML\<open>
local
local
open Latex

View File

@@ -1,7 +1,7 @@
#!/usr/bin/env bash
# Copyright (c) 2018-2019 The University of Sheffield.
# 2019-2019 The University of Exeter.
# 2018-2019 The University of Paris-Saclay.
# 2019-2020 The University of Exeter.
# 2018-2020 The University of Paris-Saclay.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions

View File

@@ -48,7 +48,7 @@ fun assert_cmd some_name modes raw_t ctxt (* state*) =
val ty' = Term.type_of t';
val ty' = case ty' of @{typ "bool"} => ty' | _ => error "Assertion expressions must be boolean.";
val t' = case t' of @{term "True"} => t' | _ => error "Assertion failed.";
val ctxt' = Variable.auto_fixes t' ctxt;
val ctxt' = Proof_Context.augment t' ctxt;
val p = Print_Mode.with_modes modes (fn () =>
Pretty.block [Pretty.quote (Syntax.pretty_term ctxt' t'), Pretty.fbrk,
Pretty.str "::", Pretty.brk 1, Pretty.quote (Syntax.pretty_typ ctxt' ty')]) ();
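The control flow of `assert_cmd` above (evaluate the term, demand a boolean type, demand `True`, then pretty-print the result) can be sketched outside Isabelle/ML. The following shell analogue is illustrative only; none of these names exist in the Isabelle sources:

```shell
# Illustrative shell analogue of assert_cmd's control flow: evaluate an
# expression, demand a "boolean" result (0 or 1), demand it be true, then
# render it. Purely a sketch; not part of Isabelle or Isabelle/DOF.
assert_cmd () {
  result=$(( $1 ))                 # stands in for term evaluation
  case "$result" in
    0|1) ;;                        # accept only boolean-like results
    *) echo "Assertion expressions must be boolean." >&2; return 2 ;;
  esac
  [ "$result" -eq 1 ] || { echo "Assertion failed." >&2; return 1; }
  echo "$1 :: bool"
}
assert_cmd "1 + 1 == 2"   # prints: 1 + 1 == 2 :: bool
```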

View File

@@ -2,15 +2,32 @@ session "Isabelle_DOF" = "Functional-Automata" +
options [document = pdf, document_output = "output"]
sessions
"Regular-Sets"
directories
"DOF"
"ontologies"
"ontologies/CENELEC_50128"
"ontologies/Conceptual"
"ontologies/math_exam"
"ontologies/math_paper"
"ontologies/scholarly_paper"
"ontologies/small_math"
"ontologies/technical_report"
theories
"DOF/Isa_DOF"
"ontologies/ontologies"
(*
session "Isabelle_DOF-tests" = "Isabelle_DOF" +
options [document = false]
directories
tests
theories
"tests/AssnsLemmaThmEtc"
"tests/Concept_ExampleInvariant"
"tests/Concept_Example"
"tests/InnerSyntaxAntiquotations"
"tests/Attributes"
*)

src/ROOTS Normal file
View File

@@ -0,0 +1 @@
tests

View File

@@ -14,7 +14,7 @@
\NeedsTeXFormat{LaTeX2e}\relax
\ProvidesPackage{DOF-cenelec_50128}
[2019/08/18 Unreleased/Isabelle2019%
[2019/08/18 Unreleased/Isabelle2020%
Document-Type Support Framework for Isabelle (CENELEC 50128).]
\RequirePackage{DOF-COL}

View File

@@ -14,7 +14,7 @@
\NeedsTeXFormat{LaTeX2e}\relax
\ProvidesPackage{DOF-scholarly_paper}
[2020/01/14 Unreleased/Isabelle2019%
[2020/01/14 Unreleased/Isabelle2020%
Document-Type Support Framework for Isabelle (LNCS).]
\RequirePackage{DOF-COL}

View File

@@ -1,495 +0,0 @@
(* Title: Pure/Thy/thy_info.ML
Author: Markus Wenzel, TU Muenchen
Global theory info database, with auto-loading according to theory and
file dependencies.
*)
signature THY_INFO =
sig
type presentation_context =
{options: Options.T, file_pos: Position.T, adjust_pos: Position.T -> Position.T,
segments: Thy_Output.segment list}
val apply_presentation: presentation_context -> theory -> unit
val add_presentation: (presentation_context -> theory -> unit) -> theory -> theory
val get_names: unit -> string list
val lookup_theory: string -> theory option
val get_theory: string -> theory
val master_directory: string -> Path.T
val remove_thy: string -> unit
type context =
{options: Options.T,
symbols: HTML.symbols,
bibtex_entries: string list,
last_timing: Toplevel.transition -> Time.time}
val use_theories: context -> string -> Path.T -> (string * Position.T) list -> unit
val use_thy: string -> unit
val script_thy: Position.T -> string -> theory -> theory
val register_thy: theory -> unit
val finish: unit -> unit
end;
structure Thy_Info: THY_INFO =
struct
(** presentation of consolidated theory **)
type presentation_context =
{options: Options.T, file_pos: Position.T, adjust_pos: Position.T -> Position.T,
segments: Thy_Output.segment list};
structure Presentation = Theory_Data
(
type T = ((presentation_context -> theory -> unit) * stamp) list;
val empty = [];
val extend = I;
fun merge data : T = Library.merge (eq_snd op =) data;
);
fun apply_presentation (context: presentation_context) thy =
ignore (Presentation.get thy |> Par_List.map (fn (f, _) => f context thy));
fun add_presentation f = Presentation.map (cons (f, stamp ()));
val _ =
Theory.setup (add_presentation (fn {options, file_pos, segments, ...} => fn thy =>
if exists (Toplevel.is_skipped_proof o #state) segments then ()
else
let
val body = Thy_Output.present_thy options thy segments;
val option = Present.document_option options;
in
if #disabled option then ()
else
let
val latex = Latex.isabelle_body (Context.theory_name thy) body;
val output = [Latex.output_text latex, Latex.output_positions file_pos latex];
val _ =
if Options.bool options "export_document"
then Export.export thy "document.tex" output else ();
val _ = if #enabled option then Present.theory_output thy output else ();
in () end
end));
(** thy database **)
(* messages *)
val show_path = space_implode " via " o map quote;
fun cycle_msg names = "Cyclic dependency of " ^ show_path names;
(* derived graph operations *)
fun add_deps name parents G = String_Graph.add_deps_acyclic (name, parents) G
handle String_Graph.CYCLES namess => error (cat_lines (map cycle_msg namess));
fun new_entry name parents entry =
String_Graph.new_node (name, entry) #> add_deps name parents;
(* global thys *)
type deps =
{master: (Path.T * SHA1.digest), (*master dependencies for thy file*)
imports: (string * Position.T) list}; (*source specification of imports (partially qualified)*)
fun make_deps master imports : deps = {master = master, imports = imports};
fun master_dir_deps (d: deps option) =
the_default Path.current (Option.map (Path.dir o #1 o #master) d);
local
val global_thys =
Synchronized.var "Thy_Info.thys"
(String_Graph.empty: (deps option * theory option) String_Graph.T);
in
fun get_thys () = Synchronized.value global_thys;
fun change_thys f = Synchronized.change global_thys f;
end;
fun get_names () = String_Graph.topological_order (get_thys ());
(* access thy *)
fun lookup thys name = try (String_Graph.get_node thys) name;
fun lookup_thy name = lookup (get_thys ()) name;
fun get thys name =
(case lookup thys name of
SOME thy => thy
| NONE => error ("Theory loader: nothing known about theory " ^ quote name));
fun get_thy name = get (get_thys ()) name;
(* access deps *)
val lookup_deps = Option.map #1 o lookup_thy;
val master_directory = master_dir_deps o #1 o get_thy;
(* access theory *)
fun lookup_theory name =
(case lookup_thy name of
SOME (_, SOME theory) => SOME theory
| _ => NONE);
fun get_theory name =
(case lookup_theory name of
SOME theory => theory
| _ => error ("Theory loader: undefined entry for theory " ^ quote name));
val get_imports = Resources.imports_of o get_theory;
(** thy operations **)
(* remove *)
fun remove name thys =
(case lookup thys name of
NONE => thys
| SOME (NONE, _) => error ("Cannot update finished theory " ^ quote name)
| SOME _ =>
let
val succs = String_Graph.all_succs thys [name];
val _ = writeln ("Theory loader: removing " ^ commas_quote succs);
in fold String_Graph.del_node succs thys end);
val remove_thy = change_thys o remove;
(* update *)
fun update deps theory thys =
let
val name = Context.theory_long_name theory;
val parents = map Context.theory_long_name (Theory.parents_of theory);
val thys' = remove name thys;
val _ = map (get thys') parents;
in new_entry name parents (SOME deps, SOME theory) thys' end;
fun update_thy deps theory = change_thys (update deps theory);
(* context *)
type context =
{options: Options.T,
symbols: HTML.symbols,
bibtex_entries: string list,
last_timing: Toplevel.transition -> Time.time};
fun default_context (): context =
{options = Options.default (),
symbols = HTML.no_symbols,
bibtex_entries = [],
last_timing = K Time.zeroTime};
(* scheduling loader tasks *)
datatype result =
Result of {theory: theory, exec_id: Document_ID.exec,
present: unit -> unit, commit: unit -> unit, weight: int};
fun theory_result theory =
Result {theory = theory, exec_id = Document_ID.none, present = I, commit = I, weight = 0};
fun result_theory (Result {theory, ...}) = theory;
fun result_present (Result {present, ...}) = present;
fun result_commit (Result {commit, ...}) = commit;
fun result_ord (Result {weight = i, ...}, Result {weight = j, ...}) = int_ord (j, i);
fun join_theory (Result {theory, exec_id, ...}) =
let
val _ = Execution.join [exec_id];
val res = Exn.capture Thm.consolidate_theory theory;
val exns = maps Task_Queue.group_status (Execution.peek exec_id);
in res :: map Exn.Exn exns end;
datatype task =
Task of string list * (theory list -> result) |
Finished of theory;
fun task_finished (Task _) = false
| task_finished (Finished _) = true;
fun task_parents deps (parents: string list) = map (the o AList.lookup (op =) deps) parents;
val schedule_seq =
String_Graph.schedule (fn deps => fn (_, task) =>
(case task of
Task (parents, body) =>
let
val result = body (task_parents deps parents);
val _ = Par_Exn.release_all (join_theory result);
val _ = result_present result ();
val _ = result_commit result ();
in result_theory result end
| Finished thy => thy)) #> ignore;
val schedule_futures = Thread_Attributes.uninterruptible (fn _ => fn tasks =>
let
val futures = tasks
|> String_Graph.schedule (fn deps => fn (name, task) =>
(case task of
Task (parents, body) =>
(singleton o Future.forks)
{name = "theory:" ^ name, group = NONE,
deps = map (Future.task_of o #2) deps, pri = 0, interrupts = true}
(fn () =>
(case filter (not o can Future.join o #2) deps of
[] => body (map (result_theory o Future.join) (task_parents deps parents))
| bad =>
error
("Failed to load theory " ^ quote name ^
" (unresolved " ^ commas_quote (map #1 bad) ^ ")")))
| Finished theory => Future.value (theory_result theory)));
val results1 = futures
|> maps (fn future =>
(case Future.join_result future of
Exn.Res result => join_theory result
| Exn.Exn exn => [Exn.Exn exn]));
val results2 = futures
|> map_filter (Exn.get_res o Future.join_result)
|> sort result_ord
|> Par_List.map (fn result => Exn.capture (result_present result) ());
(* FIXME more precise commit order (!?) *)
val results3 = futures
|> map (fn future => Exn.capture (fn () => result_commit (Future.join future) ()) ());
(* FIXME avoid global Execution.reset (!??) *)
val results4 = map Exn.Exn (maps Task_Queue.group_status (Execution.reset ()));
val _ = Par_Exn.release_all (results1 @ results2 @ results3 @ results4);
in () end);
(* eval theory *)
fun excursion keywords master_dir last_timing init elements =
let
fun prepare_span st span =
Command_Span.content span
|> Command.read keywords (Command.read_thy st) master_dir init ([], ~1)
|> (fn tr => Toplevel.put_timing (last_timing tr) tr);
fun element_result span_elem (st, _) =
let
val elem = Thy_Syntax.map_element (prepare_span st) span_elem;
val (results, st') = Toplevel.element_result keywords elem st;
val pos' = Toplevel.pos_of (Thy_Syntax.last_element elem);
in (results, (st', pos')) end;
val (results, (end_state, end_pos)) =
fold_map element_result elements (Toplevel.toplevel, Position.none);
val thy = Toplevel.end_theory end_pos end_state;
in (results, thy) end;
fun eval_thy (context: context) update_time master_dir header text_pos text parents =
let
val {options, symbols, bibtex_entries, last_timing} = context;
val (name, _) = #name header;
val keywords =
fold (curry Keyword.merge_keywords o Thy_Header.get_keywords) parents
(Keyword.add_keywords (#keywords header) Keyword.empty_keywords);
val _ = writeln "eval_thy 1";
val spans = Outer_Syntax.parse_spans (Token.explode keywords text_pos text);
val _ = writeln "eval_thy 2";
val elements = Thy_Syntax.parse_elements keywords spans;
val _ = writeln "eval_thy 3";
fun init () =
Resources.begin_theory master_dir header parents
|> Present.begin_theory bibtex_entries update_time
(fn () => implode (map (HTML.present_span symbols keywords) spans));
val (results, thy) =
cond_timeit true ("theory " ^ quote name)
(fn () => excursion keywords master_dir last_timing init elements);
fun present () =
let
val _ = writeln "eval_thy 4 - present"
val segments = (spans ~~ maps Toplevel.join_results results)
|> map (fn (span, (tr, st')) => {span = span, command = tr, state = st'});
val X = if null segments then (writeln "eval_thy 5 - no segments";[])
else List.concat(map (fn X => Command_Span.content (#span X)) segments)
val Y = (String.concatWith "::") (map Token.content_of X)
val _ = writeln("eval_thy 5 BEGIN\n"^Y^"eval_thy 6 END:"^Context.theory_name thy^"\n")
val context: presentation_context =
{options = options, file_pos = text_pos, adjust_pos = I, segments = segments};
in apply_presentation context thy end;
in (thy, present, size text) end;
(* require_thy -- checking database entries wrt. the file-system *)
local
fun required_by _ [] = ""
| required_by s initiators = s ^ "(required by " ^ show_path (rev initiators) ^ ")";
fun load_thy context initiators update_time deps text (name, pos) keywords parents =
let
val _ = remove_thy name;
val _ = writeln ("Loading theory " ^ quote name ^ required_by " " initiators);
val _ = Output.try_protocol_message (Markup.loading_theory name) [];
val {master = (thy_path, _), imports} = deps;
val dir = Path.dir thy_path;
val header = Thy_Header.make (name, pos) imports keywords;
val _ = Position.reports (map #2 imports ~~ map Theory.get_markup parents);
val exec_id = Document_ID.make ();
val _ =
Execution.running Document_ID.none exec_id [] orelse
raise Fail ("Failed to register execution: " ^ Document_ID.print exec_id);
val timing_start = Timing.start ();
val text_pos = Position.put_id (Document_ID.print exec_id) (Path.position thy_path);
val (theory, present, weight) =
eval_thy context update_time dir header text_pos text
(if name = Context.PureN then [Context.the_global_context ()] else parents);
val timing_result = Timing.result timing_start;
val timing_props = [Markup.theory_timing, (Markup.nameN, name)];
val _ = Output.try_protocol_message (timing_props @ Markup.timing_properties timing_result) []
fun commit () = update_thy deps theory;
in
Result {theory = theory, exec_id = exec_id, present = present, commit = commit, weight = weight}
end;
fun check_deps dir name =
(case lookup_deps name of
SOME NONE => (true, NONE, Position.none, get_imports name, [])
| NONE =>
let val {master, text, theory_pos, imports, keywords} = Resources.check_thy dir name
in (false, SOME (make_deps master imports, text), theory_pos, imports, keywords) end
| SOME (SOME {master, ...}) =>
let
val {master = master', text = text', theory_pos = theory_pos', imports = imports',
keywords = keywords'} = Resources.check_thy dir name;
val deps' = SOME (make_deps master' imports', text');
val current =
#2 master = #2 master' andalso
(case lookup_theory name of
NONE => false
| SOME theory => Resources.loaded_files_current theory);
in (current, deps', theory_pos', imports', keywords') end);
in
fun require_thys context initiators qualifier dir strs tasks =
fold_map (require_thy context initiators qualifier dir) strs tasks |>> forall I
and require_thy context initiators qualifier dir (s, require_pos) tasks =
let
val {master_dir, theory_name, ...} = Resources.import_name qualifier dir s;
in
(case try (String_Graph.get_node tasks) theory_name of
SOME task => (task_finished task, tasks)
| NONE =>
let
val _ = member (op =) initiators theory_name andalso error (cycle_msg initiators);
val (current, deps, theory_pos, imports, keywords) = check_deps master_dir theory_name
handle ERROR msg =>
cat_error msg
("The error(s) above occurred for theory " ^ quote theory_name ^
Position.here require_pos ^ required_by "\n" initiators);
val qualifier' = Resources.theory_qualifier theory_name;
val dir' = Path.append dir (master_dir_deps (Option.map #1 deps));
val parents = map (#theory_name o Resources.import_name qualifier' dir' o #1) imports;
val (parents_current, tasks') =
require_thys context (theory_name :: initiators) qualifier' dir' imports tasks;
val all_current = current andalso parents_current;
val task =
if all_current then Finished (get_theory theory_name)
else
(case deps of
NONE => raise Fail "Malformed deps"
| SOME (dep, text) =>
let
val update_time = serial ();
val load =
load_thy context initiators update_time
dep text (theory_name, theory_pos) keywords;
in Task (parents, load) end);
val tasks'' = new_entry theory_name parents task tasks';
in (all_current, tasks'') end)
end;
end;
(* use theories *)
fun use_theories context qualifier master_dir imports =
let val (_, tasks) = require_thys context [] qualifier master_dir imports String_Graph.empty
in if Multithreading.max_threads () > 1 then schedule_futures tasks else schedule_seq tasks end;
fun use_thy name =
use_theories (default_context ()) Resources.default_qualifier
Path.current [(name, Position.none)];
(* toplevel scripting -- without maintaining database *)
fun script_thy pos txt thy =
let
val trs =
Outer_Syntax.parse thy pos txt
|> map (Toplevel.modify_init (K thy));
val end_pos = if null trs then pos else Toplevel.pos_of (List.last trs);
val end_state = fold (Toplevel.command_exception true) trs Toplevel.toplevel;
in Toplevel.end_theory end_pos end_state end;
(* register theory *)
fun register_thy theory =
let
val name = Context.theory_long_name theory;
val {master, ...} = Resources.check_thy (Resources.master_directory theory) name;
val imports = Resources.imports_of theory;
in
change_thys (fn thys =>
let
val thys' = remove name thys;
val _ = writeln ("Registering theory " ^ quote name);
in update (make_deps master imports) theory thys' end)
end;
(* finish all theories *)
fun finish () = change_thys (String_Graph.map (fn _ => fn (_, entry) => (NONE, entry)));
end;
fun use_thy name = Runtime.toplevel_program (fn () => Thy_Info.use_thy name);

View File

@@ -1,484 +0,0 @@
(* Title: Pure/Thy/thy_info.ML
Author: Markus Wenzel, TU Muenchen
Global theory info database, with auto-loading according to theory and
file dependencies.
*)
signature THY_INFO =
sig
type presentation_context =
{options: Options.T, file_pos: Position.T, adjust_pos: Position.T -> Position.T,
segments: Thy_Output.segment list}
val apply_presentation: presentation_context -> theory -> unit
val add_presentation: (presentation_context -> theory -> unit) -> theory -> theory
val get_names: unit -> string list
val lookup_theory: string -> theory option
val get_theory: string -> theory
val master_directory: string -> Path.T
val remove_thy: string -> unit
type context =
{options: Options.T,
symbols: HTML.symbols,
bibtex_entries: string list,
last_timing: Toplevel.transition -> Time.time}
val use_theories: context -> string -> Path.T -> (string * Position.T) list -> unit
val use_thy: string -> unit
val script_thy: Position.T -> string -> theory -> theory
val register_thy: theory -> unit
val finish: unit -> unit
end;
structure Thy_Info: THY_INFO =
struct
(** presentation of consolidated theory **)
type presentation_context =
{options: Options.T, file_pos: Position.T, adjust_pos: Position.T -> Position.T,
segments: Thy_Output.segment list};
structure Presentation = Theory_Data
(
type T = ((presentation_context -> theory -> unit) * stamp) list;
val empty = [];
val extend = I;
fun merge data : T = Library.merge (eq_snd op =) data;
);
fun apply_presentation (context: presentation_context) thy =
ignore (Presentation.get thy |> Par_List.map (fn (f, _) => f context thy));
fun add_presentation f = Presentation.map (cons (f, stamp ()));
val _ =
Theory.setup (add_presentation (fn {options, file_pos, segments, ...} => fn thy =>
if exists (Toplevel.is_skipped_proof o #state) segments then ()
else
let
val body = Thy_Output.present_thy options thy segments;
val option = Present.document_option options;
in
if #disabled option then ()
else
let
val latex = Latex.isabelle_body (Context.theory_name thy) body;
val output = [Latex.output_text latex, Latex.output_positions file_pos latex];
val _ =
if Options.bool options "export_document"
then Export.export thy "document.tex" output else ();
val _ = if #enabled option then Present.theory_output thy output else ();
in () end
end));
(** thy database **)
(* messages *)
val show_path = space_implode " via " o map quote;
fun cycle_msg names = "Cyclic dependency of " ^ show_path names;
(* derived graph operations *)
fun add_deps name parents G = String_Graph.add_deps_acyclic (name, parents) G
handle String_Graph.CYCLES namess => error (cat_lines (map cycle_msg namess));
fun new_entry name parents entry =
String_Graph.new_node (name, entry) #> add_deps name parents;
(* global thys *)
type deps =
{master: (Path.T * SHA1.digest), (*master dependencies for thy file*)
imports: (string * Position.T) list}; (*source specification of imports (partially qualified)*)
fun make_deps master imports : deps = {master = master, imports = imports};
fun master_dir_deps (d: deps option) =
the_default Path.current (Option.map (Path.dir o #1 o #master) d);
local
val global_thys =
Synchronized.var "Thy_Info.thys"
(String_Graph.empty: (deps option * theory option) String_Graph.T);
in
fun get_thys () = Synchronized.value global_thys;
fun change_thys f = Synchronized.change global_thys f;
end;
fun get_names () = String_Graph.topological_order (get_thys ());
(* access thy *)
fun lookup thys name = try (String_Graph.get_node thys) name;
fun lookup_thy name = lookup (get_thys ()) name;
fun get thys name =
(case lookup thys name of
SOME thy => thy
| NONE => error ("Theory loader: nothing known about theory " ^ quote name));
fun get_thy name = get (get_thys ()) name;
(* access deps *)
val lookup_deps = Option.map #1 o lookup_thy;
val master_directory = master_dir_deps o #1 o get_thy;
(* access theory *)
fun lookup_theory name =
(case lookup_thy name of
SOME (_, SOME theory) => SOME theory
| _ => NONE);
fun get_theory name =
(case lookup_theory name of
SOME theory => theory
| _ => error ("Theory loader: undefined entry for theory " ^ quote name));
val get_imports = Resources.imports_of o get_theory;
(** thy operations **)
(* remove *)
fun remove name thys =
(case lookup thys name of
NONE => thys
| SOME (NONE, _) => error ("Cannot update finished theory " ^ quote name)
| SOME _ =>
let
val succs = String_Graph.all_succs thys [name];
val _ = writeln ("Theory loader: removing " ^ commas_quote succs);
in fold String_Graph.del_node succs thys end);
val remove_thy = change_thys o remove;
(* update *)
fun update deps theory thys =
let
val name = Context.theory_long_name theory;
val parents = map Context.theory_long_name (Theory.parents_of theory);
val thys' = remove name thys;
val _ = map (get thys') parents;
in new_entry name parents (SOME deps, SOME theory) thys' end;
fun update_thy deps theory = change_thys (update deps theory);
(* context *)
type context =
{options: Options.T,
symbols: HTML.symbols,
bibtex_entries: string list,
last_timing: Toplevel.transition -> Time.time};
fun default_context (): context =
{options = Options.default (),
symbols = HTML.no_symbols,
bibtex_entries = [],
last_timing = K Time.zeroTime};
(* scheduling loader tasks *)
datatype result =
Result of {theory: theory, exec_id: Document_ID.exec,
present: unit -> unit, commit: unit -> unit, weight: int};
fun theory_result theory =
Result {theory = theory, exec_id = Document_ID.none, present = I, commit = I, weight = 0};
fun result_theory (Result {theory, ...}) = theory;
fun result_present (Result {present, ...}) = present;
fun result_commit (Result {commit, ...}) = commit;
fun result_ord (Result {weight = i, ...}, Result {weight = j, ...}) = int_ord (j, i);
fun join_theory (Result {theory, exec_id, ...}) =
let
val _ = Execution.join [exec_id];
val res = Exn.capture Thm.consolidate_theory theory;
val exns = maps Task_Queue.group_status (Execution.peek exec_id);
in res :: map Exn.Exn exns end;
datatype task =
Task of string list * (theory list -> result) |
Finished of theory;
fun task_finished (Task _) = false
| task_finished (Finished _) = true;
fun task_parents deps (parents: string list) = map (the o AList.lookup (op =) deps) parents;
val schedule_seq =
String_Graph.schedule (fn deps => fn (_, task) =>
(case task of
Task (parents, body) =>
let
val result = body (task_parents deps parents);
val _ = Par_Exn.release_all (join_theory result);
val _ = result_present result ();
val _ = result_commit result ();
in result_theory result end
| Finished thy => thy)) #> ignore;
val schedule_futures = Thread_Attributes.uninterruptible (fn _ => fn tasks =>
let
val futures = tasks
|> String_Graph.schedule (fn deps => fn (name, task) =>
(case task of
Task (parents, body) =>
(singleton o Future.forks)
{name = "theory:" ^ name, group = NONE,
deps = map (Future.task_of o #2) deps, pri = 0, interrupts = true}
(fn () =>
(case filter (not o can Future.join o #2) deps of
[] => body (map (result_theory o Future.join) (task_parents deps parents))
| bad =>
error
("Failed to load theory " ^ quote name ^
" (unresolved " ^ commas_quote (map #1 bad) ^ ")")))
| Finished theory => Future.value (theory_result theory)));
val results1 = futures
|> maps (fn future =>
(case Future.join_result future of
Exn.Res result => join_theory result
| Exn.Exn exn => [Exn.Exn exn]));
val results2 = futures
|> map_filter (Exn.get_res o Future.join_result)
|> sort result_ord
|> Par_List.map (fn result => Exn.capture (result_present result) ());
(* FIXME more precise commit order (!?) *)
val results3 = futures
|> map (fn future => Exn.capture (fn () => result_commit (Future.join future) ()) ());
(* FIXME avoid global Execution.reset (!??) *)
val results4 = map Exn.Exn (maps Task_Queue.group_status (Execution.reset ()));
val _ = Par_Exn.release_all (results1 @ results2 @ results3 @ results4);
in () end);
(* eval theory *)
fun excursion keywords master_dir last_timing init elements =
let
fun prepare_span st span =
Command_Span.content span
|> Command.read keywords (Command.read_thy st) master_dir init ([], ~1)
|> (fn tr => Toplevel.put_timing (last_timing tr) tr);
fun element_result span_elem (st, _) =
let
val elem = Thy_Syntax.map_element (prepare_span st) span_elem;
val (results, st') = Toplevel.element_result keywords elem st;
val pos' = Toplevel.pos_of (Thy_Syntax.last_element elem);
in (results, (st', pos')) end;
val (results, (end_state, end_pos)) =
fold_map element_result elements (Toplevel.toplevel, Position.none);
val thy = Toplevel.end_theory end_pos end_state;
in (results, thy) end;
fun eval_thy (context: context) update_time master_dir header text_pos text parents =
let
val {options, symbols, bibtex_entries, last_timing} = context;
val (name, _) = #name header;
val keywords =
fold (curry Keyword.merge_keywords o Thy_Header.get_keywords) parents
(Keyword.add_keywords (#keywords header) Keyword.empty_keywords);
val spans = Outer_Syntax.parse_spans (Token.explode keywords text_pos text);
val elements = Thy_Syntax.parse_elements keywords spans;
fun init () =
Resources.begin_theory master_dir header parents
|> Present.begin_theory bibtex_entries update_time
(fn () => implode (map (HTML.present_span symbols keywords) spans));
val (results, thy) =
cond_timeit true ("theory " ^ quote name)
(fn () => excursion keywords master_dir last_timing init elements);
fun present () =
let
val segments = (spans ~~ maps Toplevel.join_results results)
|> map (fn (span, (tr, st')) => {span = span, command = tr, state = st'});
val context: presentation_context =
{options = options, file_pos = text_pos, adjust_pos = I, segments = segments};
in apply_presentation context thy end;
in (thy, present, size text) end;
(* require_thy -- checking database entries wrt. the file-system *)
local
fun required_by _ [] = ""
| required_by s initiators = s ^ "(required by " ^ show_path (rev initiators) ^ ")";
fun load_thy context initiators update_time deps text (name, pos) keywords parents =
let
val _ = remove_thy name;
val _ = writeln ("Loading theory " ^ quote name ^ required_by " " initiators);
val _ = Output.try_protocol_message (Markup.loading_theory name) [];
val {master = (thy_path, _), imports} = deps;
val dir = Path.dir thy_path;
val header = Thy_Header.make (name, pos) imports keywords;
val _ = Position.reports (map #2 imports ~~ map Theory.get_markup parents);
val exec_id = Document_ID.make ();
val _ =
Execution.running Document_ID.none exec_id [] orelse
raise Fail ("Failed to register execution: " ^ Document_ID.print exec_id);
val timing_start = Timing.start ();
val text_pos = Position.put_id (Document_ID.print exec_id) (Path.position thy_path);
val (theory, present, weight) =
eval_thy context update_time dir header text_pos text
(if name = Context.PureN then [Context.the_global_context ()] else parents);
val timing_result = Timing.result timing_start;
val timing_props = [Markup.theory_timing, (Markup.nameN, name)];
val _ = Output.try_protocol_message (timing_props @ Markup.timing_properties timing_result) []
fun commit () = update_thy deps theory;
in
Result {theory = theory, exec_id = exec_id, present = present, commit = commit, weight = weight}
end;
fun check_deps dir name =
(case lookup_deps name of
SOME NONE => (true, NONE, Position.none, get_imports name, [])
| NONE =>
let val {master, text, theory_pos, imports, keywords} = Resources.check_thy dir name
in (false, SOME (make_deps master imports, text), theory_pos, imports, keywords) end
| SOME (SOME {master, ...}) =>
let
val {master = master', text = text', theory_pos = theory_pos', imports = imports',
keywords = keywords'} = Resources.check_thy dir name;
val deps' = SOME (make_deps master' imports', text');
val current =
#2 master = #2 master' andalso
(case lookup_theory name of
NONE => false
| SOME theory => Resources.loaded_files_current theory);
in (current, deps', theory_pos', imports', keywords') end);
in
fun require_thys context initiators qualifier dir strs tasks =
fold_map (require_thy context initiators qualifier dir) strs tasks |>> forall I
and require_thy context initiators qualifier dir (s, require_pos) tasks =
let
val {master_dir, theory_name, ...} = Resources.import_name qualifier dir s;
in
(case try (String_Graph.get_node tasks) theory_name of
SOME task => (task_finished task, tasks)
| NONE =>
let
val _ = member (op =) initiators theory_name andalso error (cycle_msg initiators);
val (current, deps, theory_pos, imports, keywords) = check_deps master_dir theory_name
handle ERROR msg =>
cat_error msg
("The error(s) above occurred for theory " ^ quote theory_name ^
Position.here require_pos ^ required_by "\n" initiators);
val qualifier' = Resources.theory_qualifier theory_name;
val dir' = Path.append dir (master_dir_deps (Option.map #1 deps));
val parents = map (#theory_name o Resources.import_name qualifier' dir' o #1) imports;
val (parents_current, tasks') =
require_thys context (theory_name :: initiators) qualifier' dir' imports tasks;
val all_current = current andalso parents_current;
val task =
if all_current then Finished (get_theory theory_name)
else
(case deps of
NONE => raise Fail "Malformed deps"
| SOME (dep, text) =>
let
val update_time = serial ();
val load =
load_thy context initiators update_time
dep text (theory_name, theory_pos) keywords;
in Task (parents, load) end);
val tasks'' = new_entry theory_name parents task tasks';
in (all_current, tasks'') end)
end;
end;
(* use theories *)
fun use_theories context qualifier master_dir imports =
let val (_, tasks) = require_thys context [] qualifier master_dir imports String_Graph.empty
in if Multithreading.max_threads () > 1 then schedule_futures tasks else schedule_seq tasks end;
fun use_thy name =
use_theories (default_context ()) Resources.default_qualifier
Path.current [(name, Position.none)];
(* toplevel scripting -- without maintaining database *)
fun script_thy pos txt thy =
let
val trs =
Outer_Syntax.parse thy pos txt
|> map (Toplevel.modify_init (K thy));
val end_pos = if null trs then pos else Toplevel.pos_of (List.last trs);
val end_state = fold (Toplevel.command_exception true) trs Toplevel.toplevel;
in Toplevel.end_theory end_pos end_state end;
(* register theory *)
fun register_thy theory =
let
val name = Context.theory_long_name theory;
val {master, ...} = Resources.check_thy (Resources.master_directory theory) name;
val imports = Resources.imports_of theory;
in
change_thys (fn thys =>
let
val thys' = remove name thys;
val _ = writeln ("Registering theory " ^ quote name);
in update (make_deps master imports) theory thys' end)
end;
(* finish all theories *)
fun finish () = change_thys (String_Graph.map (fn _ => fn (_, entry) => (NONE, entry)));
end;
fun use_thy name = Runtime.toplevel_program (fn () => Thy_Info.use_thy name);


@@ -175,30 +175,30 @@ end;
(* presentation tokens *)
datatype token =
-Ignore_Token
-| Basic_Token of Token.T
-| Markup_Token of string * Input.source
-| Markup_Env_Token of string * Input.source
-| Raw_Token of Input.source;
+Ignore
+| Token of Token.T
+| Heading of string * Input.source
+| Body of string * Input.source
+| Raw of Input.source;
-fun basic_token pred (Basic_Token tok) = pred tok
-| basic_token _ _ = false;
+fun token_with pred (Token tok) = pred tok
+| token_with _ _ = false;
-val white_token = basic_token Document_Source.is_white;
-val white_comment_token = basic_token Document_Source.is_white_comment;
-val blank_token = basic_token Token.is_blank;
-val newline_token = basic_token Token.is_newline;
+val white_token = token_with Document_Source.is_white;
+val white_comment_token = token_with Document_Source.is_white_comment;
+val blank_token = token_with Token.is_blank;
+val newline_token = token_with Token.is_newline;
fun present_token ctxt tok =
(case tok of
-Ignore_Token => []
-| Basic_Token tok => output_token ctxt tok
-| Markup_Token (cmd, source) =>
+Ignore => []
+| Token tok => output_token ctxt tok
+| Heading (cmd, source) =>
Latex.enclose_body ("%\n\\isamarkup" ^ cmd ^ "{") "%\n}\n"
(output_document ctxt {markdown = false} source)
-| Markup_Env_Token (cmd, source) =>
+| Body (cmd, source) =>
[Latex.environment_block ("isamarkup" ^ cmd) (output_document ctxt {markdown = true} source)]
-| Raw_Token source =>
+| Raw source =>
Latex.string "%\n" :: output_document ctxt {markdown = true} source @ [Latex.string "\n"]);
@@ -367,7 +367,7 @@ fun present_thy options thy (segments: segment list) =
(* tokens *)
val ignored = Scan.state --| ignore
->> (fn d => (NONE, (Ignore_Token, ("", d))));
+>> (fn d => (NONE, (Ignore, ("", d))));
fun markup pred mk flag = Scan.peek (fn d =>
Document_Source.improper |--
@@ -384,29 +384,29 @@ fun present_thy options thy (segments: segment list) =
Scan.optional (Scan.one Token.is_command_modifier ::: Document_Source.improper) [] --
Scan.one Token.is_command --| Document_Source.annotation
>> (fn (cmd_mod, cmd) =>
-map (fn tok => (NONE, (Basic_Token tok, ("", d)))) cmd_mod @
+map (fn tok => (NONE, (Token tok, ("", d)))) cmd_mod @
[(SOME (Token.content_of cmd, Token.pos_of cmd),
-(Basic_Token cmd, (markup_false, d)))]));
+(Token cmd, (markup_false, d)))]));
val cmt = Scan.peek (fn d =>
-Scan.one Document_Source.is_black_comment >> (fn tok => (NONE, (Basic_Token tok, ("", d)))));
+Scan.one Document_Source.is_black_comment >> (fn tok => (NONE, (Token tok, ("", d)))));
val other = Scan.peek (fn d =>
-Parse.not_eof >> (fn tok => (NONE, (Basic_Token tok, ("", d)))));
+Parse.not_eof >> (fn tok => (NONE, (Token tok, ("", d)))));
val tokens =
(ignored ||
-markup Keyword.is_document_heading Markup_Token markup_true ||
-markup Keyword.is_document_body Markup_Env_Token markup_true ||
-markup Keyword.is_document_raw (Raw_Token o #2) "") >> single ||
+markup Keyword.is_document_heading Heading markup_true ||
+markup Keyword.is_document_body Body markup_true ||
+markup Keyword.is_document_raw (Raw o #2) "") >> single ||
command ||
(cmt || other) >> single;
(* spans *)
-val is_eof = fn (_, (Basic_Token x, _)) => Token.is_eof x | _ => false;
-val stopper = Scan.stopper (K (NONE, (Basic_Token Token.eof, ("", 0)))) is_eof;
+val is_eof = fn (_, (Token x, _)) => Token.is_eof x | _ => false;
+val stopper = Scan.stopper (K (NONE, (Token Token.eof, ("", 0)))) is_eof;
val cmd = Scan.one (is_some o fst);
val non_cmd = Scan.one (is_none o fst andf not o is_eof) >> #2;
@@ -466,7 +466,7 @@ end;
(* pretty printing *)
fun pretty_term ctxt t =
-Syntax.pretty_term (Variable.auto_fixes t ctxt) t;
+Syntax.pretty_term (Proof_Context.augment t ctxt) t;
fun pretty_thm ctxt = pretty_term ctxt o Thm.full_prop_of;


@@ -2,6 +2,8 @@
Author: Makarius
Theory document output.
+This is a modified/patched version that supports Isabelle/DOF.
*)
signature THY_OUTPUT =
@@ -176,32 +178,32 @@ end;
(* presentation tokens *)
datatype token =
-Ignore_Token
-| Basic_Token of Token.T
-| Markup_Token of string * string * Input.source
-| Markup_Env_Token of string * string * Input.source
-| Raw_Token of Input.source;
+Ignore
+| Token of Token.T
+| Heading of string * string * Input.source
+| Body of string * string * Input.source
+| Raw of Input.source;
-fun basic_token pred (Basic_Token tok) = pred tok
-| basic_token _ _ = false;
+fun token_with pred (Token tok) = pred tok
+| token_with _ _ = false;
-val white_token = basic_token Document_Source.is_white;
-val white_comment_token = basic_token Document_Source.is_white_comment;
-val blank_token = basic_token Token.is_blank;
-val newline_token = basic_token Token.is_newline;
+val white_token = token_with Document_Source.is_white;
+val white_comment_token = token_with Document_Source.is_white_comment;
+val blank_token = token_with Token.is_blank;
+val newline_token = token_with Token.is_newline;
fun present_token ctxt tok =
(case tok of
-Ignore_Token => []
-| Basic_Token tok => output_token ctxt tok
-| Markup_Token (cmd, meta_args, source) =>
+Ignore => []
+| Token tok => output_token ctxt tok
+| Heading (cmd, meta_args, source) =>
Latex.enclose_body ("%\n\\isamarkup" ^ cmd (* ^ meta_args *) ^ "{") "%\n}\n"
(output_document ctxt {markdown = false} source)
-| Markup_Env_Token (cmd, meta_args, source) =>
+| Body (cmd, meta_args, source) =>
[Latex.environment_block
("isamarkup" ^ cmd (* ^ meta_args*))
(Latex.string meta_args :: output_document ctxt {markdown = true} source)]
-| Raw_Token source =>
+| Raw source =>
Latex.string "%\n" :: output_document ctxt {markdown = true} source @ [Latex.string "\n"]);
@@ -376,7 +378,7 @@ fun present_thy options thy (segments: segment list) =
(* tokens *)
val ignored = Scan.state --| ignore
->> (fn d => (NONE, (Ignore_Token, ("", d))));
+>> (fn d => (NONE, (Ignore, ("", d))));
fun markup pred mk flag = Scan.peek (fn d =>
Document_Source.improper
@@ -399,29 +401,29 @@ fun present_thy options thy (segments: segment list) =
Scan.optional (Scan.one Token.is_command_modifier ::: Document_Source.improper) [] --
Scan.one Token.is_command --| Document_Source.annotation
>> (fn (cmd_mod, cmd) =>
-map (fn tok => (NONE, (Basic_Token tok, ("", d)))) cmd_mod @
+map (fn tok => (NONE, (Token tok, ("", d)))) cmd_mod @
[(SOME (Token.content_of cmd, Token.pos_of cmd),
-(Basic_Token cmd, (markup_false, d)))]));
+(Token cmd, (markup_false, d)))]));
val cmt = Scan.peek (fn d =>
-Scan.one Document_Source.is_black_comment >> (fn tok => (NONE, (Basic_Token tok, ("", d)))));
+Scan.one Document_Source.is_black_comment >> (fn tok => (NONE, (Token tok, ("", d)))));
val other = Scan.peek (fn d =>
-Parse.not_eof >> (fn tok => (NONE, (Basic_Token tok, ("", d)))));
+Parse.not_eof >> (fn tok => (NONE, (Token tok, ("", d)))));
val tokens =
(ignored ||
-markup Keyword.is_document_heading Markup_Token markup_true ||
-markup Keyword.is_document_body Markup_Env_Token markup_true ||
-markup Keyword.is_document_raw (Raw_Token o #3) "") >> single ||
+markup Keyword.is_document_heading Heading markup_true ||
+markup Keyword.is_document_body Body markup_true ||
+markup Keyword.is_document_raw (Raw o #3) "") >> single ||
command ||
(cmt || other) >> single;
(* spans *)
-val is_eof = fn (_, (Basic_Token x, _)) => Token.is_eof x | _ => false;
-val stopper = Scan.stopper (K (NONE, (Basic_Token Token.eof, ("", 0)))) is_eof;
+val is_eof = fn (_, (Token x, _)) => Token.is_eof x | _ => false;
+val stopper = Scan.stopper (K (NONE, (Token Token.eof, ("", 0)))) is_eof;
val cmd = Scan.one (is_some o fst);
val non_cmd = Scan.one (is_none o fst andf not o is_eof) >> #2;
@@ -488,7 +490,7 @@ end;
(* pretty printing *)
fun pretty_term ctxt t =
-Syntax.pretty_term (Variable.auto_fixes t ctxt) t;
+Syntax.pretty_term (Proof_Context.augment t ctxt) t;
fun pretty_thm ctxt = pretty_term ctxt o Thm.full_prop_of;


@@ -14,8 +14,8 @@
theory
AssnsLemmaThmEtc
imports
"../ontologies/Conceptual/Conceptual"
"../ontologies/math_paper/math_paper"
"Isabelle_DOF.Conceptual"
"Isabelle_DOF.math_paper"
begin
section\<open>Elementary Creation of Doc-items and Access of their Attributes\<close>


@@ -14,7 +14,7 @@
theory
Attributes
imports
"../ontologies/Conceptual/Conceptual"
"Isabelle_DOF.Conceptual"
begin
section\<open>Elementary Creation of Doc-items and Access of their Attributes\<close>


@@ -16,10 +16,10 @@ chapter\<open>Setting and modifying attributes of doc-items\<close>
theory
Concept_Example
imports
"../ontologies/Conceptual/Conceptual" (* we use the generic "Conceptual" ontology *)
"Isabelle_DOF.Conceptual" (* we use the generic "Conceptual" ontology *)
begin
text\<open>@{theory "Isabelle_DOF-tests.Conceptual"} provides a monitor @{typ M} enforcing a particular document structure.
text\<open>@{theory "Isabelle_DOF.Conceptual"} provides a monitor @{typ M} enforcing a particular document structure.
Here, we say: From now on, these structural rules are respected wrt. all doc\_classes M is
enabled for.\<close>
open_monitor*[struct::M]


@@ -16,7 +16,7 @@ chapter\<open>Setting and modifying attributes of doc-items\<close>
theory
Concept_ExampleInvariant
imports
"../ontologies/Conceptual/Conceptual" (* we use the generic "Conceptual" ontology *)
"Isabelle_DOF.Conceptual" (* we use the generic "Conceptual" ontology *)
begin
section\<open>Example: Standard Class Invariant\<close>


@@ -16,7 +16,7 @@ chapter\<open>Inner Syntax Antiquotations (ISA)'s\<close>
theory
InnerSyntaxAntiquotations
imports
"../ontologies/Conceptual/Conceptual"
"Isabelle_DOF.Conceptual"
begin
text\<open>Since the syntax chosen for values of doc-class attributes is HOL-syntax --- requiring

src/tests/ROOT (new file)

@@ -0,0 +1,8 @@
session "Isabelle_DOF-tests" = "Isabelle_DOF" +
options [document = false]
theories
"AssnsLemmaThmEtc"
"Concept_ExampleInvariant"
"Concept_Example"
"InnerSyntaxAntiquotations"
"Attributes"