Reorganization: Pushed Macro Core Mechanism into the DOF Core; adapted the RefMan accordingly.

parent 2f721d0f4b
commit 04f0cc7f5c
@@ -445,6 +445,7 @@ doc_class "theorem" = math_content +
          mcc :: "math_content_class" <= "thm" ...
 \<close>}\<close>
 
+
 text\<open>The class \<^verbatim>\<open>technical\<close> regroups a number of text-elements that contain typical
 "technical content" in mathematical or engineering papers: code, definitions, theorems,
 lemmas, examples. From this class, the stricter class @{typ \<open>math_content\<close>} is derived,
@@ -18,35 +18,79 @@ theory
   "03_GuidedTour"
   "Isabelle_DOF.Isa_COL"
 begin
 
+declare_reference*[infrastructure::technical]
 
 (*>*)
 
-chapter*[isadof_ontologies::technical]\<open>Developing Ontologies\<close>
+chapter*[isadof_ontologies::technical]\<open>Ontologies and their Development\<close>
 
 text\<open>
-In this chapter, we explain the concepts for modeling new ontologies, developing a document
+In this chapter, we explain the concepts of \<^isadof> in a more systematic way, and give
+guidelines for modeling new ontologies, the concepts for developing a document
 representation for them, as well as developing new document templates.
-\<close>
 
-section*[infrastructure::technical]\<open>Overview and Technical Infrastructure\<close>
-text\<open>
 \<^isadof> is embedded in the underlying generic document model of Isabelle as described in
 \<^introduction>\<open>dof\<close>. Recall that the document language can be extended dynamically, \<^ie>, new
 \<open>user-defined\<close> commands can be introduced at run-time. This is similar to the definition of new
 functions in an interpreter. \<^isadof> as a system plugin provides a number of new command
 definitions in Isabelle's document model.
 
-\<^isadof> consists basically of four components:
-\<^item> an own \<^emph>\<open>family of text-elements\<close> such as \<^boxed_theory_text>\<open>title*\<close>, \<^boxed_theory_text>\<open>chapter*\<close>,
-  \<^boxed_theory_text>\<open>text*\<close>, etc., which can be annotated with meta-information defined in the
-  underlying ontology definition and allow to build a \<^emph>\<open>core\<close> document,
-\<^item> the \<^emph>\<open>ontology definition language\<close> (called ODL) which allows for the definition
-  of document-classes and necessary auxiliary datatypes,
+\<^isadof> consists basically of five components:
+\<^item> the \<^emph>\<open>DOF-core\<close>, which provides its own \<^emph>\<open>family of commands\<close> such as
+  \<^boxed_theory_text>\<open>text*\<close>, \<^boxed_theory_text>\<open>declare_reference*\<close>,
+  \<^boxed_theory_text>\<open>update_instance*\<close>, \<^boxed_theory_text>\<open>open_monitor*\<close>, etc.
+  They allow annotating text-elements with meta-information defined in an
+  underlying ontology,
+\<^item> the \<^emph>\<open>DOF-core\<close> also provides the \<^emph>\<open>ontology definition language\<close> (called ODL)
+  which allows for the definition of document-classes and necessary auxiliary datatypes,
+\<^item> the \<^isadof> library of ontologies providing ontological concepts as well
+  as supporting infrastructure,
 \<^item> an infrastructure for ontology-specific \<^emph>\<open>layout definitions\<close>, exploiting this meta-information,
   and
 \<^item> an infrastructure for generic \<^emph>\<open>layout definitions\<close> for documents following, \<^eg>, the format
   guidelines of publishers or standardization bodies.
 
+Similarly to Isabelle, which is based on a core logic \<^theory>\<open>Pure\<close> and then extended by libraries
+to major systems like \<^verbatim>\<open>HOL\<close> or \<^verbatim>\<open>FOL\<close>, \<^isadof> has a generic core infrastructure \<^dof> and then
+presents itself to users via major library extensions, which add domain-specific
+system-extensions. Consequently, ontologies in \<^isadof> are not just a sequence of descriptions in
+\<^isadof>'s Ontology Definition Language (ODL). Rather, they are integrated documents themselves that
+provide textual descriptions, abbreviations, macro-support and even ML-code.
+Conceptually, the library of \<^isadof> is currently organized as follows
+\<^footnote>\<open>Note that the \<^emph>\<open>technical\<close> organisation is slightly different and shown in
+@{technical (unchecked) \<open>infrastructure\<close>}.\<close>:
+%
+\begin{center}
+\begin{minipage}{.9\textwidth}
+\dirtree{%
+.1 COL\DTcomment{The Common Ontology Library}.
+.2 scholarly\_paper\DTcomment{Scientific Papers}.
+.3 technical\_report\DTcomment{Extended Papers}.
+.4 CENELEC\_50128\DTcomment{Papers according to CENELEC\_50128}.
+.4 CC\_v3\_1\_R5\DTcomment{Papers according to Common Criteria}.
+.4 \ldots.
+}
+\end{minipage}
+\end{center}
+
+These libraries not only provide ontological concepts, but also syntactic sugar in Isabelle's
+command language Isar that is of major importance for users (and may be perceived as \<^isadof>'s key
+features by many authors). In reality,
+they are derived concepts from more generic ones; for example, the commands
+\<^boxed_theory_text>\<open>title*\<close>, \<^boxed_theory_text>\<open>section*\<close>, \<^boxed_theory_text>\<open>subsection*\<close>, \<^etc>,
+are in reality a kind of macro for \<^boxed_theory_text>\<open>text*[<label>::title]...\<close>,
+\<^boxed_theory_text>\<open>text*[<label>::section]...\<close>, respectively.
+These example commands are defined in the COL.
 \<close>
 
+section*[infrastructure::technical]\<open>Technical Infrastructure\<close>
 
 text\<open>
 The list of fully supported (\<^ie>, supporting both interactive ontological modeling and
 document generation) ontologies and the list of supported document templates can be
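An editorial aside, not part of the commit: the macro relationship the RefMan text above describes can be sketched as follows (the label and heading are invented for illustration).

```isabelle
(* Sketch: a starred COL heading command ... *)
section*[infra]\<open>Technical Infrastructure\<close>
(* ... behaves essentially like a text* element annotated with the
   corresponding COL document class: *)
text*[infra::section]\<open>Technical Infrastructure\<close>
```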
|
@@ -74,6 +118,7 @@ text\<open>
 
 Technically, ontologies\<^index>\<open>ontology!directory structure\<close> are stored in a directory
 \inlinebash|src/ontologies| and consist of an Isabelle theory file and a \<^LaTeX>-style file:
+%
 \begin{center}
 \begin{minipage}{.9\textwidth}
 \dirtree{%

@@ -81,12 +126,15 @@ text\<open>
 .2 src.
 .3 ontologies\DTcomment{Ontologies}.
 .4 ontologies.thy\DTcomment{Ontology Registration}.
-.4 CENELEC\_50128\DTcomment{CENELEC\_50128}.
-.5 CENELEC\_50128.thy.
-.5 DOF-CENELEC\_50128.sty.
 .4 scholarly\_paper\DTcomment{scholarly\_paper}.
 .5 scholarly\_paper.thy.
 .5 DOF-scholarly\_paper.sty.
+.4 technical\_report\DTcomment{technical\_report}.
+.5 technical\_report.thy.
+.5 DOF-technical\_report.sty.
+.4 CENELEC\_50128\DTcomment{CENELEC\_50128}.
+.5 CENELEC\_50128.thy.
+.5 DOF-CENELEC\_50128.sty.
 .4 \ldots.
 }
 \end{minipage}
@@ -23,14 +23,13 @@ text\<open> Building a fundamental infrastructure for common document elements s
 
 theory Isa_COL
   imports Isa_DOF
-  keywords "title*" "subtitle*" "chapter*"
-           "section*" "subsection*" "subsubsection*"
-           "paragraph*" "subparagraph*" :: document_body
-  and      "figure*" "side_by_side_figure*" :: document_body
+  keywords "title*" "subtitle*"
+           "chapter*" "section*"
+           "subsection*" "subsubsection*"
+           "paragraph*" "subparagraph*"
+           "figure*" "side_by_side_figure*" :: document_body
   and      "assert*" :: thy_decl
-  and      "define_shortcut*" "define_macro*" :: thy_decl
 
 begin
 
 section\<open>Basic Text and Text-Structuring Elements\<close>
|
@@ -252,201 +251,9 @@ val _ =
 end
 \<close>
 
-section\<open>Shortcuts, Macros, Environments\<close>
-text\<open>The features described in this section are actually \<^emph>\<open>not\<close> real ISADOF features, rather a
-slightly more abstract layer over somewhat buried standard features of the Isabelle document
-generator ... (Thanks to Makarius) Conceptually, they are \<^emph>\<open>sub-text-elements\<close>. \<close>
-
-text\<open>This module provides mechanisms to define front-end checked:
-\<^enum> \<^emph>\<open>shortcuts\<close>, i.e. machine-checked abbreviations without arguments
-  that were mapped to user-defined LaTeX code (Example: \<^verbatim>\<open>\ie\<close>)
-\<^enum> \<^emph>\<open>macros\<close> with one argument that were mapped to user-defined code. Example: \<^verbatim>\<open>\myurl{bla}\<close>.
-  The argument can be potentially checked and reports can be sent to PIDE;
-  if no such checking is desired, this can be expressed by setting the
-  \<^theory_text>\<open>reportNtest\<close>-parameter to \<^theory_text>\<open>K(K())\<close>.
-\<^enum> \<^emph>\<open>macros\<close> with two arguments, potentially independently checked. See above.
-  Example: \<^verbatim>\<open>\myurl[ding]{dong}\<close>,
-\<^enum> \<^emph>\<open>boxes\<close> which are more complex sub-text-elements in the line of the \<^verbatim>\<open>verbatim\<close> or
-  \<^verbatim>\<open>theory_text\<close> environments.
-
-Note that we deliberately refrained from a code-template definition mechanism for simplicity,
-so the patterns were just described by strings. No additional ado with quoting/unquoting
-mechanisms ...
-\<close>
-
-ML\<open>
-structure DOF_lib =
-struct
-  fun define_shortcut name latexshcut =
-      Thy_Output.antiquotation_raw name (Scan.succeed ())
-         (fn _ => fn () => Latex.string latexshcut)
-
-  (* This is a generalization of the Isabelle2020 function "control_antiquotation" from
-     document_antiquotations.ML. (Thanks Makarius!) *)
-  fun define_macro name s1 s2 reportNtest =
-      Thy_Output.antiquotation_raw_embedded name (Scan.lift Args.cartouche_input)
-         (fn ctxt =>
-             fn src => let val () = reportNtest ctxt src
-                       in  src |>   Latex.enclose_block s1 s2
-                                  o Thy_Output.output_document ctxt {markdown = false}
-                       end);
-
-  local (* hide away really strange local construction *)
-    fun enclose_body2 front body1 middle body2 post =
-        (if front  = "" then [] else [Latex.string front])  @ body1 @
-        (if middle = "" then [] else [Latex.string middle]) @ body2 @
-        (if post   = "" then [] else [Latex.string post]);
-  in
-    fun define_macro2 name front middle post reportNtest1 reportNtest2 =
-        Thy_Output.antiquotation_raw_embedded name (Scan.lift (Args.cartouche_input
-                                                               -- Args.cartouche_input))
-           (fn ctxt =>
-               fn (src1,src2) => let val () = reportNtest1 ctxt src1
-                                     val () = reportNtest2 ctxt src2
-                                     val T1 = Thy_Output.output_document ctxt {markdown = false} src1
-                                     val T2 = Thy_Output.output_document ctxt {markdown = false} src2
-                                 in  Latex.block (enclose_body2 front T1 middle T2 post)
-                                 end);
-  end
-
-  fun report_text ctxt text =
-      let val pos = Input.pos_of text in
-        Context_Position.reports ctxt
-          [(pos, Markup.language_text (Input.is_delimited text)),
-           (pos, Markup.raw_text)]
-      end;
-
-  fun report_theory_text ctxt text =
-      let val keywords = Thy_Header.get_keywords' ctxt;
-          val _ = report_text ctxt text;
-          val _ =
-            Input.source_explode text
-            |> Token.tokenize keywords {strict = true}
-            |> maps (Token.reports keywords)
-            |> Context_Position.reports_text ctxt;
-      in () end
-
-  fun prepare_text ctxt =
-      Input.source_content #> #1 #> Document_Antiquotation.prepare_lines ctxt;
-  (* This also produces indent-expansion and changes space to "\_" and the introduction of
-     "\newline", I believe. Otherwise it's in Thy_Output.output_source, the compiler from
-     string to LaTeX.text. *)
-
-  fun string_2_text_antiquotation ctxt text =
-      prepare_text ctxt text
-      |> Thy_Output.output_source ctxt
-      |> Thy_Output.isabelle ctxt
-
-  fun string_2_theory_text_antiquotation ctxt text =
-      let val keywords = Thy_Header.get_keywords' ctxt;
-      in  prepare_text ctxt text
-          |> Token.explode0 keywords
-          |> maps (Thy_Output.output_token ctxt)
-          |> Thy_Output.isabelle ctxt
-      end
-
-  fun gen_text_antiquotation name reportNcheck compile =
-      Thy_Output.antiquotation_raw_embedded name (Scan.lift Args.text_input)
-         (fn ctxt => fn text:Input.source =>
-             let val _ = reportNcheck ctxt text;
-             in  compile ctxt text
-             end);
-
-  fun std_text_antiquotation name (* redefined in these more abstract terms *) =
-      gen_text_antiquotation name report_text string_2_text_antiquotation
-
-  (* should be the same as (2020):
-  fun text_antiquotation name =
-    Thy_Output.antiquotation_raw_embedded name (Scan.lift Args.text_input)
-      (fn ctxt => fn text =>
-        let
-          val _ = report_text ctxt text;
-        in
-          prepare_text ctxt text
-          |> Thy_Output.output_source ctxt
-          |> Thy_Output.isabelle ctxt
-        end);
-  *)
-
-  fun std_theory_text_antiquotation name (* redefined in these more abstract terms *) =
-      gen_text_antiquotation name report_theory_text string_2_theory_text_antiquotation
-
-  (* should be the same as (2020):
-  fun theory_text_antiquotation name =
-    Thy_Output.antiquotation_raw_embedded name (Scan.lift Args.text_input)
-      (fn ctxt => fn text =>
-        let
-          val keywords = Thy_Header.get_keywords' ctxt;
-
-          val _ = report_text ctxt text;
-          val _ =
-            Input.source_explode text
-            |> Token.tokenize keywords {strict = true}
-            |> maps (Token.reports keywords)
-            |> Context_Position.reports_text ctxt;
-        in
-          prepare_text ctxt text
-          |> Token.explode0 keywords
-          |> maps (Thy_Output.output_token ctxt)
-          |> Thy_Output.isabelle ctxt
-          |> enclose_env ctxt "isarbox"
-        end);
-  *)
-
-  fun environment_delim name =
-      ("%\n\\begin{" ^ Latex.output_name name ^ "}\n",
-       "\n\\end{" ^ Latex.output_name name ^ "}");
-
-  fun environment_block name = environment_delim name |-> Latex.enclose_body #> Latex.block;
-
-  fun enclose_env verbatim ctxt block_env body =
-      if Config.get ctxt Document_Antiquotation.thy_output_display
-      then if verbatim
-           then environment_block block_env [body]
-           else Latex.environment_block block_env [body]
-      else Latex.block ([Latex.string ("\\inline" ^ block_env ^ "{")] @ [body] @ [Latex.string "}"]);
-
-end
-\<close>
-
-ML\<open>
-local
-  val parse_literal = Parse.alt_string || Parse.cartouche
-  val parse_define_shortcut = Parse.binding -- ((\<^keyword>\<open>\<rightleftharpoons>\<close> || \<^keyword>\<open>==\<close>) |-- parse_literal)
-  val define_shortcuts = fold (uncurry DOF_lib.define_shortcut)
-in
-  val _ = Outer_Syntax.command \<^command_keyword>\<open>define_shortcut*\<close> "define LaTeX shortcut"
-            (Scan.repeat1 parse_define_shortcut >> (Toplevel.theory o define_shortcuts));
-end
-\<close>
-
-ML\<open>
-val parse_literal = Parse.alt_string || Parse.cartouche
-val parse_define_shortcut = Parse.binding
-                            -- ((\<^keyword>\<open>\<rightleftharpoons>\<close> || \<^keyword>\<open>==\<close>) |-- parse_literal)
-                            --| Parse.underscore
-                            -- parse_literal
-                            -- (Scan.option (\<^keyword>\<open>(\<close> |-- Parse.ML_source --| \<^keyword>\<open>)\<close>))
-
-fun define_macro (X, NONE) = (uncurry (uncurry (uncurry DOF_lib.define_macro))) (X, K(K()))
-   |define_macro (X, SOME (src:Input.source)) =
-       let val check_code = K(K()) (* hack *)
-           val _ = warning "Checker code support Not Yet Implemented - use ML"
-       in  (uncurry (uncurry (uncurry DOF_lib.define_macro))) (X, check_code)
-       end;
-
-val _ = Outer_Syntax.command \<^command_keyword>\<open>define_macro*\<close> "define LaTeX shortcut"
-          (Scan.repeat1 parse_define_shortcut >> (Toplevel.theory o (fold define_macro)));
-\<close>
-
-ML\<open>ML_Context.expression\<close>
+(*<*)
 (*
+ML\<open>ML_Context.expression\<close>
 fun setup source =
   ML_Context.expression (Input.pos_of source)
     (ML_Lex.read "Theory.setup (" @ ML_Lex.read_source source @ ML_Lex.read ")")

@@ -454,7 +261,7 @@ fun setup source =
 setup\<open>\<close>
 
 *)
+(*>*)
 
 section\<open>Tables\<close>
 (* TODO ! ! ! *)
@@ -35,17 +35,16 @@ theory Isa_DOF (* Isabelle Document Ontology Framework *)
 
   keywords "+=" ":=" "accepts" "rejects" "invariant"
 
-  and      "open_monitor*" "close_monitor*" "declare_reference*"
-           "update_instance*" "doc_class" :: thy_decl
+  and      "open_monitor*" "close_monitor*"
+           "declare_reference*" "update_instance*"
+           "doc_class"
+           "define_shortcut*" "define_macro*" :: thy_decl
 
   and      "text*" "text-macro*" :: document_body
 
   and      "print_doc_classes" "print_doc_items"
            "print_doc_class_template" "check_doc_global" :: diag
 
-  (* experimental *)
-  and      "corrollary*" "proposition*" "lemma*" "theorem*" :: thy_decl
-  (* -- intended replacement of Isar std commands. *)
 
|
@ -674,6 +673,21 @@ fun print_doc_classes b ctxt =
|
||||||
writeln "=====================================\n\n\n"
|
writeln "=====================================\n\n\n"
|
||||||
end;
|
end;
|
||||||
|
|
||||||
|
fun print_doc_class_tree ctxt P T =
|
||||||
|
let val {docobj_tab={tab = x, ...},docclass_tab, ...} = get_data ctxt;
|
||||||
|
val class_tab:(string * docclass_struct)list = (Symtab.dest docclass_tab)
|
||||||
|
fun is_class_son X (n, dc:docclass_struct) = (X = #inherits_from dc)
|
||||||
|
fun tree lev ([]:(string * docclass_struct)list) = ""
|
||||||
|
|tree lev ((n,R)::S) = (if P(lev,n)
|
||||||
|
then "."^Int.toString lev^" "^(T n)^"\n"
|
||||||
|
^ (tree(lev + 1)(filter(is_class_son(SOME([],n)))class_tab))
|
||||||
|
else "."^Int.toString lev^" ... \n")
|
||||||
|
^ (tree lev S)
|
||||||
|
val roots = filter(is_class_son NONE) class_tab
|
||||||
|
in tree 0 roots end
|
 
 
 fun check_doc_global (strict_checking : bool) ctxt =
   let val {docobj_tab={tab = x, ...}, monitor_tab, ...} = get_data ctxt;
       val S = map_filter (fn (s,NONE) => SOME s | _ => NONE) (Symtab.dest x)
@@ -1473,44 +1487,6 @@ val _ = Thy_Output.set_meta_args_parser
 
 
-ML \<open>
-local (* dull and dangerous copy from Pure.thy given that these functions are not
-         globally exported. *)
-
-val long_keyword =
-  Parse_Spec.includes >> K "" ||
-  Parse_Spec.long_statement_keyword;
-
-val long_statement =
-  Scan.optional (Parse_Spec.opt_thm_name ":" --| Scan.ahead long_keyword) Binding.empty_atts --
-  Scan.optional Parse_Spec.includes [] -- Parse_Spec.long_statement
-    >> (fn ((binding, includes), (elems, concl)) => (true, binding, includes, elems, concl));
-
-val short_statement =
-  Parse_Spec.statement -- Parse_Spec.if_statement -- Parse.for_fixes
-    >> (fn ((shows, assumes), fixes) =>
-         (false, Binding.empty_atts, [], [Element.Fixes fixes, Element.Assumes assumes],
-          Element.Shows shows));
-
-fun theorem spec schematic descr =
-  Outer_Syntax.local_theory_to_proof' spec ("state " ^ descr)
-    ((ODL_Command_Parser.attributes -- (long_statement || short_statement))
-      >> (fn (_ (* skip *), (long, binding, includes, elems, concl)) =>
-           ((if schematic then Specification.schematic_theorem_cmd
-                          else Specification.theorem_cmd)
-              long Thm.theoremK NONE (K I) binding includes elems concl)));
-
-in
-
-(* Half-fake. Activates original Isar commands, but skips meta-arguments for the moment. *)
-(* tendance deprecated - see new scholarly paper setup. *)
-val _ = theorem @{command_keyword "theorem*"} false "theorem";
-val _ = theorem @{command_keyword "lemma*"} false "lemma";
-val _ = theorem @{command_keyword "corrollary*"} false "corollary";
-val _ = theorem @{command_keyword "proposition*"} false "proposition";
-
-end\<close>
-
-
 section\<open>Syntax for Ontological Antiquotations (the ''View'' Part II)\<close>
 
@@ -1848,7 +1824,201 @@ val _ =
 end (* struct *)
 \<close>
 
-text\<open>dfgd\<close>
 
+section\<open>Shortcuts, Macros, Environments\<close>
+text\<open>The features described in this section are actually \<^emph>\<open>not\<close> real ISADOF features, rather a
+slightly more abstract layer over somewhat buried standard features of the Isabelle document
+generator ... (Thanks to Makarius) Conceptually, they are \<^emph>\<open>sub-text-elements\<close>. \<close>
+
+text\<open>This module provides mechanisms to define front-end checked:
+\<^enum> \<^emph>\<open>shortcuts\<close>, i.e. machine-checked abbreviations without arguments
+  that were mapped to user-defined LaTeX code (Example: \<^verbatim>\<open>\ie\<close>)
+\<^enum> \<^emph>\<open>macros\<close> with one argument that were mapped to user-defined code. Example: \<^verbatim>\<open>\myurl{bla}\<close>.
+  The argument can be potentially checked and reports can be sent to PIDE;
+  if no such checking is desired, this can be expressed by setting the
+  \<^theory_text>\<open>reportNtest\<close>-parameter to \<^theory_text>\<open>K(K())\<close>.
+\<^enum> \<^emph>\<open>macros\<close> with two arguments, potentially independently checked. See above.
+  Example: \<^verbatim>\<open>\myurl[ding]{dong}\<close>,
+\<^enum> \<^emph>\<open>boxes\<close> which are more complex sub-text-elements in the line of the \<^verbatim>\<open>verbatim\<close> or
+  \<^verbatim>\<open>theory_text\<close> environments.
+
+Note that we deliberately refrained from a code-template definition mechanism for simplicity,
+so the patterns were just described by strings. No additional ado with quoting/unquoting
+mechanisms ...
+\<close>
 
+ML\<open>
+structure DOF_lib =
+struct
+  fun define_shortcut name latexshcut =
+      Thy_Output.antiquotation_raw name (Scan.succeed ())
+         (fn _ => fn () => Latex.string latexshcut)
+
+  (* This is a generalization of the Isabelle2020 function "control_antiquotation" from
+     document_antiquotations.ML. (Thanks Makarius!) *)
+  fun define_macro name s1 s2 reportNtest =
+      Thy_Output.antiquotation_raw_embedded name (Scan.lift Args.cartouche_input)
+         (fn ctxt =>
+             fn src => let val () = reportNtest ctxt src
+                       in  src |>   Latex.enclose_block s1 s2
+                                  o Thy_Output.output_document ctxt {markdown = false}
+                       end);
+
+  local (* hide away really strange local construction *)
+    fun enclose_body2 front body1 middle body2 post =
+        (if front  = "" then [] else [Latex.string front])  @ body1 @
+        (if middle = "" then [] else [Latex.string middle]) @ body2 @
+        (if post   = "" then [] else [Latex.string post]);
+  in
+    fun define_macro2 name front middle post reportNtest1 reportNtest2 =
+        Thy_Output.antiquotation_raw_embedded name (Scan.lift (Args.cartouche_input
+                                                               -- Args.cartouche_input))
+           (fn ctxt =>
+               fn (src1,src2) => let val () = reportNtest1 ctxt src1
+                                     val () = reportNtest2 ctxt src2
+                                     val T1 = Thy_Output.output_document ctxt {markdown = false} src1
+                                     val T2 = Thy_Output.output_document ctxt {markdown = false} src2
+                                 in  Latex.block (enclose_body2 front T1 middle T2 post)
+                                 end);
+  end
+
+  fun report_text ctxt text =
+      let val pos = Input.pos_of text in
+        Context_Position.reports ctxt
+          [(pos, Markup.language_text (Input.is_delimited text)),
+           (pos, Markup.raw_text)]
+      end;
+
+  fun report_theory_text ctxt text =
+      let val keywords = Thy_Header.get_keywords' ctxt;
+          val _ = report_text ctxt text;
+          val _ =
+            Input.source_explode text
+            |> Token.tokenize keywords {strict = true}
+            |> maps (Token.reports keywords)
+            |> Context_Position.reports_text ctxt;
+      in () end
+
+  fun prepare_text ctxt =
+      Input.source_content #> #1 #> Document_Antiquotation.prepare_lines ctxt;
+  (* This also produces indent-expansion and changes space to "\_" and the introduction of
+     "\newline", I believe. Otherwise it's in Thy_Output.output_source, the compiler from
+     string to LaTeX.text. *)
+
+  fun string_2_text_antiquotation ctxt text =
+      prepare_text ctxt text
+      |> Thy_Output.output_source ctxt
+      |> Thy_Output.isabelle ctxt
+
+  fun string_2_theory_text_antiquotation ctxt text =
+      let val keywords = Thy_Header.get_keywords' ctxt;
+      in  prepare_text ctxt text
+          |> Token.explode0 keywords
+          |> maps (Thy_Output.output_token ctxt)
+          |> Thy_Output.isabelle ctxt
+      end
+
+  fun gen_text_antiquotation name reportNcheck compile =
+      Thy_Output.antiquotation_raw_embedded name (Scan.lift Args.text_input)
+         (fn ctxt => fn text:Input.source =>
+             let val _ = reportNcheck ctxt text;
+             in  compile ctxt text
+             end);
+
+  fun std_text_antiquotation name (* redefined in these more abstract terms *) =
+      gen_text_antiquotation name report_text string_2_text_antiquotation
+
+  (* should be the same as (2020):
+  fun text_antiquotation name =
+    Thy_Output.antiquotation_raw_embedded name (Scan.lift Args.text_input)
+      (fn ctxt => fn text =>
+        let
+          val _ = report_text ctxt text;
+        in
+          prepare_text ctxt text
+          |> Thy_Output.output_source ctxt
+          |> Thy_Output.isabelle ctxt
+        end);
+  *)
+
+  fun std_theory_text_antiquotation name (* redefined in these more abstract terms *) =
+      gen_text_antiquotation name report_theory_text string_2_theory_text_antiquotation
+
+  (* should be the same as (2020):
+  fun theory_text_antiquotation name =
+    Thy_Output.antiquotation_raw_embedded name (Scan.lift Args.text_input)
+      (fn ctxt => fn text =>
+        let
+          val keywords = Thy_Header.get_keywords' ctxt;
+
+          val _ = report_text ctxt text;
+          val _ =
+            Input.source_explode text
+            |> Token.tokenize keywords {strict = true}
+            |> maps (Token.reports keywords)
+            |> Context_Position.reports_text ctxt;
+        in
+          prepare_text ctxt text
+          |> Token.explode0 keywords
+          |> maps (Thy_Output.output_token ctxt)
+          |> Thy_Output.isabelle ctxt
+          |> enclose_env ctxt "isarbox"
+        end);
+  *)
+
+  fun environment_delim name =
+      ("%\n\\begin{" ^ Latex.output_name name ^ "}\n",
+       "\n\\end{" ^ Latex.output_name name ^ "}");
+
+  fun environment_block name = environment_delim name |-> Latex.enclose_body #> Latex.block;
+
+  fun enclose_env verbatim ctxt block_env body =
+      if Config.get ctxt Document_Antiquotation.thy_output_display
+      then if verbatim
+           then environment_block block_env [body]
+           else Latex.environment_block block_env [body]
+      else Latex.block ([Latex.string ("\\inline" ^ block_env ^ "{")] @ [body] @ [Latex.string "}"]);
+
+end
+\<close>
+
+ML\<open>
+local
+  val parse_literal = Parse.alt_string || Parse.cartouche
+  val parse_define_shortcut = Parse.binding -- ((\<^keyword>\<open>\<rightleftharpoons>\<close> || \<^keyword>\<open>==\<close>) |-- parse_literal)
+  val define_shortcuts = fold (uncurry DOF_lib.define_shortcut)
+in
+  val _ = Outer_Syntax.command \<^command_keyword>\<open>define_shortcut*\<close> "define LaTeX shortcut"
+            (Scan.repeat1 parse_define_shortcut >> (Toplevel.theory o define_shortcuts));
+end
+\<close>
+
+ML\<open>
+val parse_literal = Parse.alt_string || Parse.cartouche
+val parse_define_shortcut = Parse.binding
+                            -- ((\<^keyword>\<open>\<rightleftharpoons>\<close> || \<^keyword>\<open>==\<close>) |-- parse_literal)
+                            --| Parse.underscore
+                            -- parse_literal
+                            -- (Scan.option (\<^keyword>\<open>(\<close> |-- Parse.ML_source --| \<^keyword>\<open>)\<close>))
+
+fun define_macro (X, NONE) = (uncurry (uncurry (uncurry DOF_lib.define_macro))) (X, K(K()))
+   |define_macro (X, SOME (src:Input.source)) =
+       let val check_code = K(K()) (* hack *)
+           val _ = warning "Checker code support Not Yet Implemented - use ML"
+       in  (uncurry (uncurry (uncurry DOF_lib.define_macro))) (X, check_code)
+       end;
+
+val _ = Outer_Syntax.command \<^command_keyword>\<open>define_macro*\<close> "define LaTeX shortcut"
+          (Scan.repeat1 parse_define_shortcut >> (Toplevel.theory o (fold define_macro)));
+\<close>
 
 (*
 ML\<open>
 Pretty.text;