Resolved merge conflict.

Achim D. Brucker 2022-03-20 20:49:46 +00:00
commit 2314b2191f
18 changed files with 737 additions and 254 deletions

NOTES.thy (new file)

@@ -0,0 +1,255 @@
chapter \<open>Notes on Isabelle/DOF for Isabelle2021-1\<close>
theory NOTES
imports Main
begin
section \<open>Isabelle/DOF component setup\<close>
subsection \<open>Terminology\<close>
text \<open>
\<^item> The concept of \<^emph>\<open>Isabelle system component\<close> is explained in \<^doc>\<open>system\<close>
section 1.1.1; see also \<^tool>\<open>components\<close> explained in section 7.3.
For example:
\<^verbatim>\<open>isabelle components -l\<close>
\<^item> In the private terminology of Burkhart, the word "component" has a
different meaning: a tool implemented in Isabelle/ML that happens to
declare context data (many ML tools do that; it is not very special and
similar to defining a \<^verbatim>\<open>datatype\<close> or \<^verbatim>\<open>structure\<close> in ML).
\<close>
subsection \<open>Isabelle/DOF as component\<close>
text \<open>
Formal Isabelle/DOF component setup happens here:
\<^item> \<^path>\<open>$ISABELLE_HOME_USER/etc/components\<close>
\<^item> A suitable directory entry in that file registers our component with an
existing Isabelle installation; it also activates the session
directory tree starting at \<^file>\<open>$ISABELLE_DOF_HOME/ROOTS\<close>.
\<^item> Alternative: use \<^path>\<open>$ISABELLE_HOME/Admin/build_release\<close> with
option \<^verbatim>\<open>-c\<close> to produce a derived Isabelle distribution that bundles
our component for end-users (maybe even with AFP entries).
\<^item> \<^file>\<open>$ISABELLE_DOF_HOME/etc/settings\<close>
\<^item> This provides a pervasive Bash process environment (variables,
shell functions etc.). It may refer to \<^verbatim>\<open>$COMPONENT\<close> for the
component root, e.g. to retain it in variable \<^dir>\<open>$ISABELLE_DOF_HOME\<close>.
\<^item> Historically, it used to be the main configuration area, but today
we see more and more alternatives, e.g. system options or services in
Isabelle/Scala (see below).
\<^item> \<^file>\<open>$ISABELLE_DOF_HOME/etc/options\<close>
\<^item> options declared as \<^verbatim>\<open>public\<close> appear in the Isabelle/jEdit dialog
\<^action>\<open>plugin-options\<close> (according to their \<^verbatim>\<open>section\<close>)
\<^item> all options (public and non-public) are available for command-line
usage, e.g. \<^verbatim>\<open>isabelle build -o dof_url="..."\<close>
\<^item> access to options in Isabelle/ML:
\<^item> implicit (for the running ML session)
\<^ML>\<open>Options.default_string \<^system_option>\<open>dof_url\<close>\<close>
\<^item> explicit (e.g. for each theories section in
\<^file>\<open>$ISABELLE_HOME/src/Pure/Tools/build.ML\<close>):
\<^ML>\<open>fn options => Options.string options \<^system_option>\<open>dof_url\<close>\<close>
\<^item> access in Isabelle/Scala is always explicit; the initial options
should be created only once and passed around as an explicit argument:
\<^scala>\<open>{
val options = isabelle.Options.init();
options.string("dof_url");
}\<close>
Note: there are no antiquotations in Isabelle/Scala, so the literal
string \<^scala>\<open>"dof_url"\<close> is unchecked.
\<close>
section \<open>Document preparation in Isabelle/ML\<close>
subsection \<open>Session presentation hook\<close>
text \<open>
\<^ML>\<open>Build.add_hook\<close> allows installing a global session presentation
hook. It is used e.g. in Isabelle/Mirabelle to analyze all loaded
theories via Sledgehammer and other tools. Isabelle/DOF could use it to
"wrap up" the whole session, e.g. to check that all document constraints
hold (cf. "monitors").
\<close>
subsection \<open>Theory presentation hook\<close>
text \<open>
\<^ML>\<open>Thy_Info.add_presentation\<close> installs a hook to be invoked at the
end of successful loading of theories; the provided context
\<^ML_type>\<open>Thy_Info.presentation_context\<close> gives access to
\<^ML_type>\<open>Options.T\<close> and \<^ML_type>\<open>Document_Output.segment\<close> with
command spans and semantic states.
An example is the regular LaTeX output in
\<^file>\<open>$ISABELLE_HOME/src/Pure/Thy/thy_info.ML\<close> where \<^ML>\<open>Export.export\<close>
is used to produce export artifacts in the session build database, for
retrieval via Isabelle/Scala.
\<close>
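text \<open>
  A minimal sketch (illustration only, not part of the Isabelle/DOF sources):
  a presentation hook that merely reports the theory name and the number of
  command segments, following the same pattern that the Isabelle/DOF
  implementation itself uses.
\<close>
ML \<open>
  (* sketch: report the segment count of each successfully loaded theory *)
  val _ =
    Theory.setup (Thy_Info.add_presentation (fn {segments, ...} => fn thy =>
      writeln ("presented " ^ Context.theory_name thy ^ ": " ^
        string_of_int (length segments) ^ " segments")));
\<close>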
subsection \<open>Document commands\<close>
text \<open>
Isar toplevel commands now support a uniform concept for
\<^ML_type>\<open>Toplevel.presentation\<close>, but the exported interfaces are
limited to commands that do not change the semantic state: see
\<^ML>\<open>Toplevel.present\<close> and \<^ML>\<open>Toplevel.present_local_theory\<close>.
Since \<^verbatim>\<open>Toplevel.present_theory\<close> is missing in Isabelle2021-1, we use a
workaround with an alternative presentation hook: it exports
\<^verbatim>\<open>document/latex_dof\<close> files instead of the regular \<^verbatim>\<open>document/latex\<close> ones.
\<close>
subsection \<open>Document content\<close>
text \<open>
XML is now used uniformly (sometimes as inlined YXML). The meaning of
markup elements and properties is defined in
\<^scala_type>\<open>isabelle.Latex.Output\<close> (or user-defined subclasses).
\<close>
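text \<open>
  A small illustration (not part of the Isabelle/DOF sources): since document
  output is plain XML, a LaTeX macro can be produced by enclosing a body in
  literal strings.
\<close>
ML \<open>
  (* sketch: wrap a string into a LaTeX macro; the result is XML/Latex.text *)
  val example_text : Latex.text =
    XML.enclose "\\emph{" "}" (Latex.string "some text");
\<close>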
section \<open>Isabelle/Scala services\<close>
subsection \<open>Isabelle/DOF/Scala module\<close>
text \<open>
\<^item> \<^file>\<open>$ISABELLE_DOF_HOME/etc/build.props\<close> is the main specification for
the Isabelle/DOF/Scala module. It is built on the spot as required, e.g.
for \<^verbatim>\<open>isabelle scala\<close> or \<^verbatim>\<open>isabelle jedit\<close>; an alternative is to invoke
\<^verbatim>\<open>isabelle scala_build\<close> manually. See also \<^doc>\<open>system\<close>, chapter 5,
especially section 5.2.
\<^item> \<^verbatim>\<open>isabelle scala_project\<close> helps to develop Isabelle/Scala tools with
proper IDE support, notably IntelliJ IDEA: the generated project uses
Maven. See also \<^doc>\<open>system\<close>, section 5.2.3.
\<^item> Command-line tools should always be implemented in Scala; old-fashioned
shell scripts are no longer required (and are more difficult to implement
properly). Only a few low-level tools are outside the Scala environment,
e.g. \<^verbatim>\<open>isabelle getenv\<close>. Add-on components should always use a name
prefix for their tools, e.g. \<^verbatim>\<open>isabelle dof_mkroot\<close> as defined in
\<^file>\<open>$ISABELLE_DOF_HOME/src/scala/dof_mkroot.scala\<close>.
\<close>
subsection \<open>Document preparation\<close>
text \<open>
\<^item> \<^scala_type>\<open>isabelle.Document_Build.Engine\<close> is the main entry-point
for user-defined document preparation; existing templates and examples
are defined in the same module \<^file>\<open>~~/src/Pure/Thy/document_build.scala\<close>.
There are two stages:
\<^enum> \<^verbatim>\<open>prepare_directory\<close>: populate the document output directory (e.g.
copy static document files, collect generated document sources from the
session build database).
\<^enum> \<^verbatim>\<open>build_document\<close>: produce the final PDF within the document output
directory (e.g. via standard LaTeX tools).
See also \<^system_option>\<open>document_build\<close> as explained in \<^doc>\<open>system\<close>,
section 3.3.
\<close>
section \<open>Miscellaneous NEWS and Notes\<close>
text \<open>
\<^item> Document preparation: there are many new options etc. that might help
to fine-tune DOF output, e.g. \<^system_option>\<open>document_comment_latex\<close>.
\<^item> ML: Theory_Data / Generic_Data: "val extend = I" has been removed; it
had been obsolete since Isabelle2021 (a minimal declaration sketch is
given after this list).
\<^item> ML: \<^ML>\<open>Thm.instantiate\<close> and related operations now use explicit
abstract types for the instantiation; see \<^file>\<open>~~/src/Pure/term_items.ML\<close>
\<^item> ML: the new antiquotation "instantiate" allows instantiating formal
entities (types, terms, theorems) with values given in ML. For example:
\<^ML>\<open>fn (A, B) =>
\<^instantiate>\<open>'a = A and 'b = B in typ \<open>('a \<times> 'b) list\<close>\<close>\<close>
\<^ML>\<open>fn A =>
\<^instantiate>\<open>'a = A in
lemma (schematic) \<open>x = y \<Longrightarrow> y = x\<close> for x y :: 'a by simp\<close>\<close>
\<^item> ML: new antiquotations for type constructors and term constants. For
example:
\<^ML>\<open>\<^Type>\<open>nat\<close>\<close>
\<^ML>\<open>fn (A, B) => \<^Type>\<open>fun A B\<close>\<close>
\<^ML>\<open>\<^Type_fn>\<open>fun A B => \<open>(A, B)\<close>\<close>\<close>
\<^ML>\<open>fn (A, B) => \<^Const>\<open>conj for A B\<close>\<close>
\<^ML>\<open>\<^Const_fn>\<open>conj for A B => \<open>(A, B)\<close>\<close>\<close>
\<^ML>\<open>fn t =>
case t of
\<^Const_>\<open>plus T for x y\<close> => ("+", T, x, y)
| \<^Const_>\<open>minus T for x y\<close> => ("-", T, x, y)
| \<^Const_>\<open>times T for x y\<close> => ("*", T, x, y)\<close>
Note: do not use unchecked things like
\<^ML>\<open>Const ("List.list.Nil", Type ("Nat.nat", []))\<close>
\<^item> ML: antiquotations "try" and "can" operate directly on the given ML
expression, in contrast to the functions "try" and "can", which modify
the application of a function.
Note: instead of semantically ill-defined "handle _ => ...", use
something like this:
\<^ML>\<open>
fn (x, y) =>
(case \<^try>\<open>x div y\<close> of
SOME z => z
| NONE => 0)
\<close>
\<^ML>\<open>
fn (x, y) => \<^try>\<open>x div y\<close> |> the_default 0
\<close>
\<close>
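text \<open>
  A minimal declaration sketch for the \<^verbatim>\<open>Theory_Data\<close> note above (illustration
  only): the functor now expects just \<^verbatim>\<open>empty\<close> and \<^verbatim>\<open>merge\<close>, without the
  former \<^verbatim>\<open>extend\<close> field.
\<close>
ML \<open>
  (* sketch: context data without "val extend = I" *)
  structure Example_Data = Theory_Data
  (
    type T = string list;
    val empty = [];
    val merge = Library.merge (op =);
  );
\<close>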
text \<open>Adhoc examples:\<close>
ML \<open>
fun mk_plus x y = \<^Const>\<open>plus \<^Type>\<open>nat\<close> for x y\<close>;
fn \<^Const_>\<open>plus \<^Type>\<open>nat\<close> for \<^Const_>\<open>Groups.zero \<^Type>\<open>nat\<close>\<close> y\<close> => y;
\<close>
ML \<open>
fn (A, B) =>
\<^instantiate>\<open>A and B in term \<open>A \<and> B \<longrightarrow> B \<and> A\<close>\<close>;
fn (A, B) =>
\<^instantiate>\<open>A and B in lemma \<open>A \<and> B \<longrightarrow> B \<and> A\<close> by simp\<close>;
\<close>
end
(* :maxLineLen=75: *)

etc/build.props (new file)

@@ -0,0 +1,12 @@
title = Isabelle/DOF
module = $ISABELLE_HOME_USER/DOF/isabelle_dof.jar
no_build = false
requirements = \
env:ISABELLE_SCALA_JAR
sources = \
src/scala/dof_document_build.scala \
src/scala/dof_mkroot.scala \
src/scala/dof_tools.scala
services = \
isabelle_dof.DOF_Tools \
isabelle_dof.DOF_Document_Build$Engine

etc/options (new file)

@@ -0,0 +1,12 @@
(* :mode=isabelle-options: *)
section "Isabelle/DOF"
public option dof_url : string = "https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF"
-- "Isabelle/DOF source repository"
option dof_artifact_dir : string = "releases/Isabelle_DOF/Isabelle_DOF"
-- "Isabelle/DOF release artifacts"
option dof_artifact_host : string = "artifacts.logicalhacking.com"

etc/settings (new file)

@@ -0,0 +1,3 @@
# -*- shell-script -*- :mode=shellscript:
ISABELLE_DOF_HOME="$COMPONENT"

(changed file)

@@ -1,5 +1,5 @@
session "mini_odo" = "Isabelle_DOF" +
options [document = pdf, document_output = "output"]
options [document = pdf, document_output = "output", document_build = dof]
theories
"mini_odo"
document_files

(changed file)

@@ -1,5 +1,6 @@
session "2018-cicm-isabelle_dof-applications" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", quick_and_dirty = true]
options [document = pdf, document_output = "output", document_build = dof,
quick_and_dirty = true]
theories
IsaDofApplications
document_files

(changed file)

@@ -1,5 +1,5 @@
session "2020-iFM-csp" = "Isabelle_DOF" +
options [document = pdf, document_output = "output"]
options [document = pdf, document_output = "output", document_build = dof]
theories
"paper"
document_files

(changed file)

@@ -1,5 +1,6 @@
session "Isabelle_DOF-Manual" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", quick_and_dirty = true]
options [document = pdf, document_output = "output", document_build = dof,
quick_and_dirty = true]
theories
"Isabelle_DOF-Manual"
document_files

(changed file)

@@ -1,2 +1,2 @@
Isabelle_DOF-Manual
- TR_my_commented_isabelle
+ #TR_my_commented_isabelle

(changed file)

@@ -1,5 +1,6 @@
session "TR_MyCommentedIsabelle" = "Isabelle_DOF" +
options [document = pdf, document_output = "output",quick_and_dirty = true]
options [document = pdf, document_output = "output", document_build = dof,
quick_and_dirty = true]
theories
"TR_MyCommentedIsabelle"
document_files

(changed file)

@@ -106,60 +106,32 @@ in
fun enriched_formal_statement_command ncid (S: (string * string) list) =
let fun transform_attr doc_attrs = (map (fn(cat,tag) => ((cat,@{here}),tag)) S) @
(("formal_results",@{here}),"([]::thm list)")::doc_attrs
in fn md => fn margs => fn thy =>
in fn margs => fn thy =>
gen_enriched_document_cmd {inline=true}
(transform_cid thy ncid) transform_attr md margs thy
(transform_cid thy ncid) transform_attr margs thy
end;
fun enriched_document_cmd_exp ncid (S: (string * string) list) =
(* expands ncid into supertype-check. *)
let fun transform_attr attrs = (map (fn(cat,tag) => ((cat,@{here}),tag)) S) @ attrs
in fn md => fn margs => fn thy =>
gen_enriched_document_cmd {inline=true} (transform_cid thy ncid) transform_attr md margs thy
in fn margs => fn thy =>
gen_enriched_document_cmd {inline=true} (transform_cid thy ncid) transform_attr margs thy
end;
end (* local *)
val _ =
Outer_Syntax.command ("title*", @{here}) "section heading"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (enriched_text_element_cmd NONE {markdown = false} ))) ;
fun heading_command (name, pos) descr level =
ODL_Command_Parser.document_command (name, pos) descr
{markdown = false, body = true} (enriched_text_element_cmd level);
val _ =
Outer_Syntax.command ("subtitle*", @{here}) "section heading"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (enriched_text_element_cmd NONE {markdown = false} )));
val _ =
Outer_Syntax.command ("chapter*", @{here}) "section heading"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (enriched_text_element_cmd (SOME(SOME 0)) {markdown = false} )));
val _ =
Outer_Syntax.command ("section*", @{here}) "section heading"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (enriched_text_element_cmd (SOME(SOME 1)) {markdown = false} )));
val _ =
Outer_Syntax.command ("subsection*", @{here}) "subsection heading"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (enriched_text_element_cmd (SOME(SOME 2)) {markdown = false} )));
val _ =
Outer_Syntax.command ("subsubsection*", @{here}) "subsubsection heading"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (enriched_text_element_cmd (SOME(SOME 3)) {markdown = false} )));
val _ =
Outer_Syntax.command ("paragraph*", @{here}) "paragraph heading"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (enriched_text_element_cmd (SOME(SOME 4)) {markdown = false} )));
val _ =
Outer_Syntax.command ("subparagraph*", @{here}) "subparagraph heading"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (enriched_text_element_cmd (SOME(SOME 5)) {markdown = false} )));
val _ = heading_command ("title*", @{here}) "section heading" NONE;
val _ = heading_command ("subtitle*", @{here}) "section heading" NONE;
val _ = heading_command ("chapter*", @{here}) "section heading" (SOME (SOME 0));
val _ = heading_command ("section*", @{here}) "section heading" (SOME (SOME 1));
val _ = heading_command ("subsection*", @{here}) "subsection heading" (SOME (SOME 2));
val _ = heading_command ("subsubsection*", @{here}) "subsubsection heading" (SOME (SOME 3));
val _ = heading_command ("paragraph*", @{here}) "paragraph heading" (SOME (SOME 4));
val _ = heading_command ("subparagraph*", @{here}) "subparagraph heading" (SOME (SOME 5));
end
end
@@ -206,24 +178,13 @@ print_doc_classes
subsection\<open>Ontological Macros\<close>
ML\<open> local open ODL_Command_Parser in
ML\<open>
(* *********************************************************************** *)
(* Ontological Macro Command Support *)
(* *********************************************************************** *)
val _ =
Outer_Syntax.command ("figure*", @{here}) "figure"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (Onto_Macros.enriched_text_element_cmd NONE {markdown = false} )));
val _ =
Outer_Syntax.command ("side_by_side_figure*", @{here}) "multiple figures"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (Onto_Macros.enriched_text_element_cmd NONE {markdown = false} )));
end
val _ = Onto_Macros.heading_command ("figure*", @{here}) "figure" NONE;
val _ = Onto_Macros.heading_command ("side_by_side_figure*", @{here}) "multiple figures" NONE;
\<close>
(*<*)
@@ -247,22 +208,22 @@ section\<open>Tables\<close>
ML\<open>
local
fun mk_line st1 st2 [a] = [a @ [Latex.string st2]]
|mk_line st1 st2 (a::S) = [a @ [Latex.string st1]] @ mk_line st1 st2 S;
fun mk_line st1 st2 [a] = [a @ Latex.string st2]
|mk_line st1 st2 (a::S) = [a @ Latex.string st1] @ mk_line st1 st2 S;
fun table_antiquotation name =
Thy_Output.antiquotation_raw_embedded name
Document_Output.antiquotation_raw_embedded name
(Scan.repeat1(Scan.repeat1(Scan.lift Args.cartouche_input)))
(fn ctxt =>
(fn content:Input.source list list =>
let fun check _ = () (* ToDo *)
val _ = check content
in content
|> (map(map (Thy_Output.output_document ctxt {markdown = false})
|> (map(map (Document_Output.output_document ctxt {markdown = false})
#> mk_line "&" "\\\\"
#> List.concat )
#> List.concat)
|> Latex.enclose_block "\\table[allerhandquatsch]{" "}"
|> XML.enclose "\\table[allerhandquatsch]{" "}"
end
)
);

(changed file)

@@ -75,9 +75,7 @@ val schemeN = "_scheme"
(* derived from: theory_markup *)
fun docref_markup_gen refN def name id pos =
if id = 0 then Markup.empty
else
Markup.properties (Position.entity_properties_of def id pos)
(Markup.entity refN name); (* or better store the thy-name as property ? ? ? *)
else Position.make_entity_markup {def = def} id refN (name, pos); (* or better store the thy-name as property ? ? ? *)
val docref_markup = docref_markup_gen docrefN
@@ -1210,8 +1208,6 @@ fun document_command markdown (loc, txt) =
*)
ML\<open>
structure ODL_Command_Parser =
struct
@@ -1221,7 +1217,6 @@ type meta_args_t = (((string * Position.T) *
(string * Position.T) option)
* ((string * Position.T) * string) list)
val semi = Scan.option (Parse.$$$ ";");
val is_improper = not o (Token.is_proper orf Token.is_begin_ignore orf Token.is_end_ignore);
val improper = Scan.many is_improper; (* parses white-space and comments *)
@@ -1332,7 +1327,8 @@ fun calc_update_term thy cid_long (S:(string * Position.T * string * term)list)
let val cid_ty = cid_2_cidType cid_long thy
val generalize_term = Term.map_types (generalize_typ 0)
fun toString t = Syntax.string_of_term (Proof_Context.init_global thy) t
fun instantiate_term S t = Term_Subst.map_types_same (Term_Subst.instantiateT S) (t)
fun instantiate_term S t =
Term_Subst.map_types_same (Term_Subst.instantiateT (TVars.make S)) (t)
fun read_assn (lhs, pos:Position.T, opr, rhs) term =
let val info_opt = DOF_core.get_attribute_info cid_long (Long_Name.base_name lhs) thy
val (ln,lnt,lnu,lnut) = case info_opt of
@@ -1560,74 +1556,6 @@ fun update_instance_command (((oid:string,pos),cid_pos),
end
(* old version :
fun gen_enriched_document_command {inline=is_inline} cid_transform attr_transform
markdown
(((((oid,pos),cid_pos), doc_attrs) : meta_args_t,
xstring_opt:(xstring * Position.T) option),
toks:Input.source)
: theory -> theory =
let val toplvl = Toplevel.theory_toplevel
fun check thy = let val ctxt = Toplevel.presentation_context (toplvl thy);
val pos = Input.pos_of toks;
val _ = Context_Position.reports ctxt
[(pos, Markup.language_document(Input.is_delimited toks)),
(pos, Markup.plain_text)];
in thy end
in
(* ... level-attribute information management *)
( create_and_check_docitem {is_monitor=false}
{is_inline=is_inline}
oid pos (cid_transform cid_pos)
(attr_transform doc_attrs)
(* checking and markup generation *)
#> check )
(* Thanks Frederic Tuong for the hints concerning toplevel ! ! ! *)
end;
*)
fun gen_enriched_document_cmd {inline=is_inline} cid_transform attr_transform
markdown
(((((oid,pos),cid_pos), doc_attrs) : meta_args_t,
xstring_opt:(xstring * Position.T) option),
toks:Input.source)
: theory -> theory =
let val toplvl = Toplevel.theory_toplevel
(* as side-effect, generates markup *)
fun check_n_tex_text thy = let val ctxt = Toplevel.presentation_context (toplvl thy);
val pos = Input.pos_of toks;
val _ = Context_Position.reports ctxt
[(pos, Markup.language_document (Input.is_delimited toks)),
(pos, Markup.plain_text)];
val text = Thy_Output.output_document
(Proof_Context.init_global thy)
markdown toks
(* val file = {path = Path.make [oid ^ "_snippet.tex"],
pos = @{here},
content = Latex.output_text text}
val _ = Generated_Files.write_file (Path.make ["latex_test"])
file
val _ = writeln (Latex.output_text text) *)
in thy end
(* ... generating the level-attribute syntax *)
in
(* ... level-attribute information management *)
( create_and_check_docitem
{is_monitor = false} {is_inline = is_inline}
oid pos (cid_transform cid_pos) (attr_transform doc_attrs)
(* checking and markup generation *)
#> check_n_tex_text )
(* Thanks Frederic Tuong for the hints concerning toplevel ! ! ! *)
end;
(* General criticism : attributes like "level" were treated here in the kernel instead of dragging
them out into the COL -- bu *)
@@ -1677,8 +1605,193 @@ fun close_monitor_command (args as (((oid:string,pos),cid_pos),
|> delete_monitor_entry
end
(* Core Command Definitions *)
fun meta_args_2_latex thy ((((lab, _), cid_opt), attr_list) : meta_args_t) =
(* for the moment naive, i.e. without textual normalization of
attribute names and adapted term printing *)
let val l = "label = "^ (enclose "{" "}" lab)
(* val _ = writeln("meta_args_2_string lab:"^ lab ^":"^ (@{make_string } cid_opt) ) *)
val cid_long = case cid_opt of
NONE => (case DOF_core.get_object_global lab thy of
NONE => DOF_core.default_cid
| SOME X => #cid X)
| SOME(cid,_) => DOF_core.parse_cid_global thy cid
(* val _ = writeln("meta_args_2_string cid_long:"^ cid_long ) *)
val cid_txt = "type = " ^ (enclose "{" "}" cid_long);
fun ltx_of_term _ _ (Const ("List.list.Cons", @{typ "char \<Rightarrow> char list \<Rightarrow> char list"}) $ t1 $ t2)
= HOLogic.dest_string (Const ("List.list.Cons", @{typ "char \<Rightarrow> char list \<Rightarrow> char list"}) $ t1 $ t2)
| ltx_of_term _ _ (Const ("List.list.Nil", _)) = ""
| ltx_of_term _ _ (@{term "numeral :: _ \<Rightarrow> _"} $ t) = Int.toString(HOLogic.dest_numeral t)
| ltx_of_term ctx encl ((Const ("List.list.Cons", _) $ t1) $ t2) =
let val inner = (case t2 of
Const ("List.list.Nil", _) => (ltx_of_term ctx true t1)
| _ => ((ltx_of_term ctx false t1)^", " ^(ltx_of_term ctx false t2))
)
in if encl then enclose "{" "}" inner else inner end
| ltx_of_term _ _ (Const ("Option.option.None", _)) = ""
| ltx_of_term ctxt _ (Const ("Option.option.Some", _)$t) = ltx_of_term ctxt true t
| ltx_of_term ctxt _ t = ""^(Sledgehammer_Util.hackish_string_of_term ctxt t)
fun ltx_of_term_dbg ctx encl term = let
val t_str = ML_Syntax.print_term term
handle (TERM _) => "Exception TERM in ltx_of_term_dbg (print_term)"
val ltx = ltx_of_term ctx encl term
val _ = writeln("<STRING>"^(Sledgehammer_Util.hackish_string_of_term ctx term)^"</STRING>")
val _ = writeln("<LTX>"^ltx^"</LTX>")
val _ = writeln("<TERM>"^t_str^"</TERM>")
in ltx end
fun markup2string s = String.concat (List.filter (fn c => c <> Symbol.DEL)
(Symbol.explode (YXML.content_of s)))
fun ltx_of_markup ctxt s = let
val term = (Syntax.check_term ctxt o Syntax.parse_term ctxt) s
val str_of_term = ltx_of_term ctxt true term
handle _ => "Exception in ltx_of_term"
in
str_of_term
end
fun toLong n = #long_name(the(DOF_core.get_attribute_info cid_long (markup2string n) thy))
val ctxt = Proof_Context.init_global thy
val actual_args = map (fn ((lhs,_),rhs) => (toLong lhs, ltx_of_markup ctxt rhs))
attr_list
val default_args = map (fn (b,_,t) => (toLong (Long_Name.base_name ( Sign.full_name thy b)),
ltx_of_term ctxt true t))
(DOF_core.get_attribute_defaults cid_long thy)
val default_args_filtered = filter (fn (a,_) => not (exists (fn b => b = a)
(map (fn (c,_) => c) actual_args))) default_args
val str_args = map (fn (lhs,rhs) => lhs^" = "^(enclose "{" "}" rhs))
(actual_args@default_args_filtered)
val label_and_type = String.concat [ l, ",", cid_txt]
val str_args = label_and_type::str_args
in
Latex.string (enclose "[" "]" (String.concat [ label_and_type, ", args={", (commas str_args), "}"]))
end
(* level-attribute information management *)
fun gen_enriched_document_cmd {inline} cid_transform attr_transform
((((oid,pos),cid_pos), doc_attrs) : meta_args_t) : theory -> theory =
create_and_check_docitem {is_monitor = false} {is_inline = inline}
oid pos (cid_transform cid_pos) (attr_transform doc_attrs);
(* markup reports and document output *)
(* {markdown = true} sets the parsing process such that in the text-core
markdown elements are accepted. *)
fun document_output {markdown: bool, markup: Latex.text -> Latex.text} meta_args text ctxt =
let
val thy = Proof_Context.theory_of ctxt;
val _ = Context_Position.reports ctxt (Document_Output.document_reports text);
val output_meta = meta_args_2_latex thy meta_args;
val output_text = Document_Output.output_document ctxt {markdown = markdown} text;
in markup (output_meta @ output_text) end;
fun document_output_reports name {markdown, body} meta_args text ctxt =
let
val pos = Input.pos_of text;
val _ =
Context_Position.reports ctxt
[(pos, Markup.language_document (Input.is_delimited text)),
(pos, Markup.plain_text)];
fun markup xml =
let val m = if body then Markup.latex_body else Markup.latex_heading
in [XML.Elem (m (Latex.output_name name), xml)] end;
in document_output {markdown = markdown, markup = markup} meta_args text ctxt end;
(* document output commands *)
local
(* alternative presentation hook (workaround for missing Toplevel.present_theory) *)
structure Document_Commands = Theory_Data
(
type T = (string * (meta_args_t -> Input.source -> Proof.context -> Latex.text)) list;
val empty = [];
val merge = AList.merge (op =) (K true);
);
fun get_document_command thy name =
AList.lookup (op =) (Document_Commands.get thy) name;
fun document_segment (segment: Document_Output.segment) =
(case #span segment of
Command_Span.Span (Command_Span.Command_Span (name, _), _) =>
(case try Toplevel.theory_of (#state segment) of
SOME thy => get_document_command thy name
| _ => NONE)
| _ => NONE);
fun present_segment (segment: Document_Output.segment) =
(case document_segment segment of
SOME pr =>
let
val {span, command = tr, prev_state = st, state = st'} = segment;
val src = Command_Span.content (#span segment) |> filter_out Document_Source.is_improper;
val parse = attributes -- Parse.document_source;
fun present ctxt =
let val (meta_args, text) = #1 (Token.syntax (Scan.lift parse) src ctxt);
in pr meta_args text ctxt end;
val tr' =
Toplevel.empty
|> Toplevel.name (Toplevel.name_of tr)
|> Toplevel.position (Toplevel.pos_of tr)
|> Toplevel.present (Toplevel.presentation_context #> present);
val st'' = Toplevel.command_exception false tr' st'
handle Runtime.EXCURSION_FAIL (exn, _) => Exn.reraise exn;
val FIXME =
Toplevel.setmp_thread_position tr (fn () =>
writeln ("present_segment" ^ Position.here (Toplevel.pos_of tr) ^ "\n" ^
XML.string_of (XML.Elem (Markup.empty, the_default [] (Toplevel.output_of st'))) ^ "\n---\n" ^
XML.string_of (XML.Elem (Markup.empty, the_default [] (Toplevel.output_of st''))))) ()
in {span = span, command = tr, prev_state = st, state = st''} end
| _ => segment);
val _ =
Theory.setup (Thy_Info.add_presentation (fn {options, segments, ...} => fn thy =>
if exists (Toplevel.is_skipped_proof o #state) segments then ()
else
let
val segments' = map present_segment segments;
val body = Document_Output.present_thy options thy segments';
in
if Options.string options "document" = "false" orelse
forall (is_none o document_segment) segments' then ()
else
let
val thy_name = Context.theory_name thy;
val latex = Latex.isabelle_body thy_name body;
in Export.export thy \<^path_binding>\<open>document/latex_dof\<close> latex end
end));
in
fun document_command (name, pos) descr mark cmd =
(Outer_Syntax.command (name, pos) descr
(attributes -- Parse.document_source >>
(fn (meta_args, text) =>
Toplevel.theory (fn thy =>
let
val thy' = cmd meta_args thy;
val _ =
(case get_document_command thy' name of
SOME pr => ignore (pr meta_args text (Proof_Context.init_global thy'))
| NONE => ());
in thy' end)));
(Theory.setup o Document_Commands.map)
(AList.update (op =) (name, document_output_reports name mark)));
end;
(* Core Command Definitions *)
val _ =
Outer_Syntax.command @{command_keyword "open_monitor*"}
@@ -1696,19 +1809,16 @@ val _ =
"update meta-attributes of an instance of a document class"
(attributes_upd >> (Toplevel.theory o update_instance_command));
val _ =
Outer_Syntax.command ("text*", @{here}) "formal comment (primary style)"
(attributes -- Parse.opt_target -- Parse.document_source
>> (Toplevel.theory o (gen_enriched_document_cmd {inline=true} I I {markdown = true} )));
document_command ("text*", @{here}) "formal comment (primary style)"
{markdown = true, body = true} (gen_enriched_document_cmd {inline=true} I I);
(* This is just a stub at present *)
val _ =
Outer_Syntax.command ("text-macro*", @{here}) "formal comment macro"
(attributes -- Parse.opt_target -- Parse.document_source
>> (Toplevel.theory o (gen_enriched_document_cmd {inline=false} (* declare as macro *)
I I {markdown = true} )));
document_command ("text-macro*", @{here}) "formal comment macro"
{markdown = true, body = true}
(gen_enriched_document_cmd {inline=false} (* declare as macro *) I I);
val _ =
Outer_Syntax.command @{command_keyword "declare_reference*"}
@@ -1865,14 +1975,13 @@ val _ =
>> (fn (meta_args_opt, eval_args ) => pass_trans_to_value_cmd meta_args_opt eval_args));
val _ = Theory.setup
(Thy_Output.antiquotation_pretty_source_embedded \<^binding>\<open>value*\<close>
(Document_Output.antiquotation_pretty_source_embedded \<^binding>\<open>value*\<close>
(Scan.lift opt_evaluator -- Term_Style.parse -- Args.term)
(fn ctxt => fn ((name, style), t) =>
Thy_Output.pretty_term ctxt (style (value_select name ctxt t)))
#> add_evaluator (\<^binding>\<open>simp\<close>, Code_Simp.dynamic_value) #> snd
#> add_evaluator (\<^binding>\<open>nbe\<close>, Nbe.dynamic_value) #> snd
#> add_evaluator (\<^binding>\<open>code\<close>, Code_Evaluation.dynamic_value_strict)
#> snd);
(fn ctxt => fn ((name, style), t) =>
Document_Output.pretty_term ctxt (style (value_select name ctxt t)))
#> add_evaluator (\<^binding>\<open>simp\<close>, Code_Simp.dynamic_value) #> snd
#> add_evaluator (\<^binding>\<open>nbe\<close>, Nbe.dynamic_value) #> snd
#> add_evaluator (\<^binding>\<open>code\<close>, Code_Evaluation.dynamic_value_strict) #> snd);
end;
\<close>
@@ -2003,7 +2112,7 @@ ML\<open>
structure OntoLinkParser =
struct
val basic_entity = Thy_Output.antiquotation_pretty_source
val basic_entity = Document_Output.antiquotation_pretty_source
: binding -> 'a context_parser -> (Proof.context -> 'a -> Pretty.T) -> theory -> theory;
fun check_and_mark ctxt cid_decl (str:{strict_checking: bool}) {inline=inline_req} pos name =
@@ -2055,20 +2164,19 @@ fun pretty_docitem_antiquotation_generic cid_decl ctxt ({unchecked, define}, src
val inline = Config.get ctxt Document_Antiquotation.thy_output_display
val _ = check_and_mark ctxt cid_decl {strict_checking = not unchecked}
{inline = inline} pos str
val enc = Latex.enclose_block
in
(case (define,inline) of
(true,false) => enc("\\csname isaDof.label\\endcsname[type={"^cid_decl^"}] {")"}"
|(false,false)=> enc("\\csname isaDof.ref\\endcsname[type={"^cid_decl^"}] {")"}"
|(true,true) => enc("\\csname isaDof.macroDef\\endcsname[type={"^cid_decl^"}]{")"}"
|(false,true) => enc("\\csname isaDof.macroExp\\endcsname[type={"^cid_decl^"}]{")"}"
(true,false) => XML.enclose("\\csname isaDof.label\\endcsname[type={"^cid_decl^"}] {")"}"
|(false,false)=> XML.enclose("\\csname isaDof.ref\\endcsname[type={"^cid_decl^"}] {")"}"
|(true,true) => XML.enclose("\\csname isaDof.macroDef\\endcsname[type={"^cid_decl^"}]{")"}"
|(false,true) => XML.enclose("\\csname isaDof.macroExp\\endcsname[type={"^cid_decl^"}]{")"}"
)
[Latex.text (Input.source_content src)]
(Latex.text (Input.source_content src))
end
fun docitem_antiquotation bind cid =
Thy_Output.antiquotation_raw bind docitem_antiquotation_parser
Document_Output.antiquotation_raw bind docitem_antiquotation_parser
(pretty_docitem_antiquotation_generic cid);
@@ -2107,7 +2215,7 @@ ML\<open>
structure AttributeAccess =
struct
val basic_entity = Thy_Output.antiquotation_pretty_source
val basic_entity = Document_Output.antiquotation_pretty_source
: binding -> 'a context_parser -> (Proof.context -> 'a -> Pretty.T) -> theory -> theory;
fun symbex_attr_access0 ctxt proj_term term =
@@ -2195,13 +2303,13 @@ fun compute_cid_repr ctxt cid pos =
local
fun pretty_attr_access_style ctxt (style, ((oid,pos),(attr,pos'))) =
Thy_Output.pretty_term ctxt (style (compute_attr_access (Context.Proof ctxt)
Document_Output.pretty_term ctxt (style (compute_attr_access (Context.Proof ctxt)
attr oid pos pos'));
fun pretty_trace_style ctxt (style, (oid,pos)) =
Thy_Output.pretty_term ctxt (style (compute_attr_access (Context.Proof ctxt)
Document_Output.pretty_term ctxt (style (compute_attr_access (Context.Proof ctxt)
"trace" oid pos pos));
fun pretty_cid_style ctxt (style, (cid,pos)) =
Thy_Output.pretty_term ctxt (style (compute_cid_repr ctxt cid pos));
Document_Output.pretty_term ctxt (style (compute_cid_repr ctxt cid pos));
in
val _ = Theory.setup
@@ -2221,7 +2329,7 @@ end
text\<open> Note that the functions \<^verbatim>\<open>basic_entities\<close> and \<^verbatim>\<open>basic_entity\<close> in
@{ML_structure AttributeAccess} are copied from
@{file "$ISABELLE_HOME/src/Pure/Thy/thy_output.ML"} \<close>
@{file "$ISABELLE_HOME/src/Pure/Thy/document_output.ML"} \<close>
section\<open> Syntax for Ontologies (the '' View'' Part III) \<close>
@@ -2262,7 +2370,15 @@ fun read_fields raw_fields ctxt =
val trace_attr = ((Binding.make("trace",@{here}), "(doc_class rexp \<times> string) list",Mixfix.NoSyn),
SOME "[]"): ((binding * string * mixfix) * string option)
fun def_cmd (decl, spec, prems, params) = #2 oo Specification.definition' decl params prems spec
fun def_cmd (decl, spec, prems, params) lthy =
let
val ((lhs as Free (x, T), _), lthy') = Specification.definition decl params prems spec lthy;
val lhs' = Morphism.term (Local_Theory.target_morphism lthy') lhs;
val _ =
Proof_Display.print_consts true (Position.thread_data ()) lthy'
(Frees.defined (Frees.build (Frees.add_frees lhs'))) [(x, T)]
in lthy' end
fun meta_eq_const T = Const (\<^const_name>\<open>Pure.eq\<close>, T --> T --> propT);
fun mk_meta_eq (t, u) = meta_eq_const (fastype_of t) $ t $ u;
@@ -2270,7 +2386,7 @@ fun define_cond binding f_sty cond_suffix read_cond (ctxt:local_theory) =
let val bdg = Binding.suffix_name cond_suffix binding
val eq = mk_meta_eq(Free(Binding.name_of bdg, f_sty),read_cond)
val args = (SOME(bdg,NONE,NoSyn), (Binding.empty_atts,eq),[],[])
in def_cmd args true ctxt end
in def_cmd args ctxt end
fun define_inv cid_long ((lbl, pos), inv) thy =
let val bdg = Binding.make (lbl,pos)
@@ -2433,34 +2549,34 @@ ML\<open>
structure DOF_lib =
struct
fun define_shortcut name latexshcut =
Thy_Output.antiquotation_raw name (Scan.succeed ())
Document_Output.antiquotation_raw name (Scan.succeed ())
(fn _ => fn () => Latex.string latexshcut)
(* This is a generalization of the Isabelle2020 function "control_antiquotation" from
document_antiquotations.ML. (Thanks Makarius!) *)
fun define_macro name s1 s2 reportNtest =
Thy_Output.antiquotation_raw_embedded name (Scan.lift Args.cartouche_input)
Document_Output.antiquotation_raw_embedded name (Scan.lift Args.cartouche_input)
(fn ctxt =>
fn src => let val () = reportNtest ctxt src
in src |> Latex.enclose_block s1 s2
o Thy_Output.output_document ctxt {markdown = false}
in src |> XML.enclose s1 s2
o Document_Output.output_document ctxt {markdown = false}
end);
local (* hide away really strange local construction *)
fun enclose_body2 front body1 middle body2 post =
(if front = "" then [] else [Latex.string front]) @ body1 @
(if middle = "" then [] else [Latex.string middle]) @ body2 @
(if post = "" then [] else [Latex.string post]);
(if front = "" then [] else Latex.string front) @ body1 @
(if middle = "" then [] else Latex.string middle) @ body2 @
(if post = "" then [] else Latex.string post);
in
fun define_macro2 name front middle post reportNtest1 reportNtest2 =
Thy_Output.antiquotation_raw_embedded name (Scan.lift ( Args.cartouche_input
Document_Output.antiquotation_raw_embedded name (Scan.lift ( Args.cartouche_input
-- Args.cartouche_input))
(fn ctxt =>
fn (src1,src2) => let val () = reportNtest1 ctxt src1
val () = reportNtest2 ctxt src2
val T1 = Thy_Output.output_document ctxt {markdown = false} src1
val T2 = Thy_Output.output_document ctxt {markdown = false} src2
in Latex.block(enclose_body2 front T1 middle T2 post)
val T1 = Document_Output.output_document ctxt {markdown = false} src1
val T2 = Document_Output.output_document ctxt {markdown = false} src2
in enclose_body2 front T1 middle T2 post
end);
end
@@ -2488,8 +2604,8 @@ fun prepare_text ctxt =
fun string_2_text_antiquotation ctxt text =
prepare_text ctxt text
|> Thy_Output.output_source ctxt
|> Thy_Output.isabelle ctxt
|> Document_Output.output_source ctxt
|> Document_Output.isabelle ctxt
fun string_2_theory_text_antiquotation ctxt text =
let
@@ -2497,12 +2613,12 @@ fun string_2_theory_text_antiquotation ctxt text =
in
prepare_text ctxt text
|> Token.explode0 keywords
|> maps (Thy_Output.output_token ctxt)
|> Thy_Output.isabelle ctxt
|> maps (Document_Output.output_token ctxt)
|> Document_Output.isabelle ctxt
end
fun gen_text_antiquotation name reportNcheck compile =
Thy_Output.antiquotation_raw_embedded name (Scan.lift Args.text_input)
Document_Output.antiquotation_raw_embedded name (Scan.lift Args.text_input)
(fn ctxt => fn text:Input.source =>
let
val _ = reportNcheck ctxt text;
@@ -2556,15 +2672,15 @@ fun environment_delim name =
("%\n\\begin{" ^ Latex.output_name name ^ "}\n",
"\n\\end{" ^ Latex.output_name name ^ "}");
fun environment_block name = environment_delim name |-> Latex.enclose_body #> Latex.block;
fun environment_block name = environment_delim name |-> XML.enclose;
fun enclose_env verbatim ctxt block_env body =
if Config.get ctxt Document_Antiquotation.thy_output_display
then if verbatim
then environment_block block_env [body]
else Latex.environment_block block_env [body]
else Latex.block ([Latex.string ("\\inline"^block_env ^"{")] @ [body] @ [ Latex.string ("}")]);
then environment_block block_env body
else Latex.environment block_env body
else XML.enclose ("\\inline"^block_env ^"{") "}" body;
end
\<close>

(changed file)

@@ -1,5 +1,5 @@
session "Isabelle_DOF" = "Functional-Automata" +
options [document = pdf, document_output = "output"]
options [document = pdf, document_output = "output", document_build = dof]
sessions
"Regular-Sets"
directories

(changed file)

@@ -47,23 +47,15 @@ doc_class abstract =
ML\<open>
local open ODL_Command_Parser in
val _ = Outer_Syntax.command ("abstract*", @{here}) "Textual Definition"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (Onto_Macros.enriched_document_cmd_exp
(SOME "abstract")
[]
{markdown = true} )));
val _ =
ODL_Command_Parser.document_command ("abstract*", @{here}) "Textual Definition"
{markdown = true, body = true}
(Onto_Macros.enriched_document_cmd_exp (SOME "abstract") []);
val _ = Outer_Syntax.command ("author*", @{here}) "Textual Definition"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (Onto_Macros.enriched_document_cmd_exp
(SOME "author")
[]
{markdown = true} )));
end
val _ =
ODL_Command_Parser.document_command ("author*", @{here}) "Textual Definition"
{markdown = true, body = true}
(Onto_Macros.enriched_document_cmd_exp (SOME "author") []);
\<close>
text\<open>Scholarly Paper is oriented towards the classical domains in science:
@@ -298,45 +290,41 @@ setup\<open>Theorem_default_class_setup\<close>
ML\<open> local open ODL_Command_Parser in
(* {markdown = true} sets the parsing process such that in the text-core
markdown elements are accepted. *)
val _ =
ODL_Command_Parser.document_command ("Definition*", @{here}) "Textual Definition"
{markdown = true, body = true}
(fn meta_args => fn thy =>
let
val ddc = Config.get_global thy Definition_default_class
val use_Definition_default = SOME(((ddc = "") ? (K "math_content")) ddc)
in
Onto_Macros.enriched_formal_statement_command
use_Definition_default [("mcc","defn")] meta_args thy
end);
val _ = let fun use_Definition_default thy =
let val ddc = Config.get_global thy Definition_default_class
in (SOME(((ddc = "") ? (K "math_content")) ddc)) end
in Outer_Syntax.command ("Definition*", @{here}) "Textual Definition"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (fn args => fn thy =>
Onto_Macros.enriched_formal_statement_command
(use_Definition_default thy)
[("mcc","defn")]
{markdown = true} args thy)))
end;
val _ =
ODL_Command_Parser.document_command ("Lemma*", @{here}) "Textual Lemma Outline"
{markdown = true, body = true}
(fn meta_args => fn thy =>
let
val ddc = Config.get_global thy Definition_default_class
val use_Lemma_default = SOME(((ddc = "") ? (K "math_content")) ddc)
in
Onto_Macros.enriched_formal_statement_command
use_Lemma_default [("mcc","lem")] meta_args thy
end);
val _ = let fun use_Lemma_default thy =
let val ddc = Config.get_global thy Definition_default_class
in (SOME(((ddc = "") ? (K "math_content")) ddc)) end
in Outer_Syntax.command ("Lemma*", @{here}) "Textual Lemma Outline"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (fn args => fn thy =>
Onto_Macros.enriched_formal_statement_command
(use_Lemma_default thy)
[("mcc","lem")]
{markdown = true} args thy)))
end;
val _ = let fun use_Theorem_default thy =
let val ddc = Config.get_global thy Definition_default_class
in (SOME(((ddc = "") ? (K "math_content")) ddc)) end
in Outer_Syntax.command ("Theorem*", @{here}) "Textual Theorem Outline"
(attributes -- Parse.opt_target -- Parse.document_source --| semi
>> (Toplevel.theory o (fn args => fn thy =>
Onto_Macros.enriched_formal_statement_command
(use_Theorem_default thy)
[("mcc","thm")]
{markdown = true} args thy)))
end;
val _ =
ODL_Command_Parser.document_command ("Theorem*", @{here}) "Textual Theorem Outline"
{markdown = true, body = true}
(fn meta_args => fn thy =>
let
val ddc = Config.get_global thy Definition_default_class
val use_Theorem_default = SOME(((ddc = "") ? (K "math_content")) ddc)
in
Onto_Macros.enriched_formal_statement_command
use_Theorem_default [("mcc","thm")] meta_args thy
end);
end
\<close>

src/scala/dof_document_build.scala (new file)

@@ -0,0 +1,52 @@
/* Author: Makarius
Build theory document (PDF) from session database.
*/
package isabelle_dof
import isabelle._
object DOF_Document_Build
{
class Engine extends Document_Build.Bash_Engine("dof")
{
override def use_build_script: Boolean = true
override def prepare_directory(
context: Document_Build.Context,
dir: Path,
doc: Document_Build.Document_Variant): Document_Build.Directory =
{
val latex_output = new Latex_Output(context.options)
val directory = context.prepare_directory(dir, doc, latex_output)
// produced by alternative presentation hook (workaround for missing Toplevel.present_theory)
for (name <- context.document_theories) {
val path = Path.basic(Document_Build.tex_name(name))
val xml =
YXML.parse_body(context.get_export(name.theory, Export.DOCUMENT_LATEX + "_dof").text)
if (xml.nonEmpty) {
File.Content(path, xml).output(latex_output(_, file_pos = path.implode_symbolic))
.write(directory.doc_dir)
}
}
directory
}
}
class Latex_Output(options: Options) extends Latex.Output(options)
{
override def latex_environment(
name: String,
body: Latex.Text,
optional_argument: String = ""): Latex.Text =
{
XML.enclose(
"\n\\begin{" + name + "}" + optional_argument + "\n",
"\n\\end{" + name + "}", body)
}
}
}

src/scala/dof_mkroot.scala (new file)

@@ -0,0 +1,69 @@
/* Author: Makarius
Prepare session root directory for Isabelle/DOF.
*/
package isabelle_dof
import isabelle._
object DOF_Mkroot
{
/** mkroot **/
def mkroot(
session_name: String = "",
session_dir: Path = Path.current,
session_parent: String = "",
init_repos: Boolean = false,
title: String = "",
author: String = "",
progress: Progress = new Progress): Unit =
{
Mkroot.mkroot(session_name = session_name, session_dir = session_dir,
session_parent = session_parent, init_repos = init_repos,
title = title, author = author, progress = progress)
}
/** Isabelle tool wrapper **/
val isabelle_tool = Isabelle_Tool("dof_mkroot", "prepare session root directory for Isabelle/DOF",
Scala_Project.here, args =>
{
var author = ""
var init_repos = false
var title = ""
var session_name = ""
val getopts = Getopts("""
Usage: isabelle dof_mkroot [OPTIONS] [DIRECTORY]
Options are:
-A LATEX provide author in LaTeX notation (default: user name)
-I init Mercurial repository and add generated files
-T LATEX provide title in LaTeX notation (default: session name)
-n NAME alternative session name (default: directory base name)
Prepare session root directory (default: current directory).
""",
"A:" -> (arg => author = arg),
"I" -> (arg => init_repos = true),
"T:" -> (arg => title = arg),
"n:" -> (arg => session_name = arg))
val more_args = getopts(args)
val session_dir =
more_args match {
case Nil => Path.current
case List(dir) => Path.explode(dir)
case _ => getopts.usage()
}
mkroot(session_name = session_name, session_dir = session_dir, init_repos = init_repos,
author = author, title = title, progress = new Console_Progress)
})
}

src/scala/dof_tools.scala (new file)

@@ -0,0 +1,13 @@
/* Author: Makarius
Isabelle/DOF command-line tools.
*/
package isabelle_dof
import isabelle._
class DOF_Tools extends Isabelle_Scala_Tools(
DOF_Mkroot.isabelle_tool
)

(changed file)

@@ -126,12 +126,11 @@ rm -f *.aux
sed -i -e 's/<@>/@/g' *.tex
$ISABELLE_TOOL latex -o sty "root.tex" && \
$ISABELLE_TOOL latex -o "$OUTFORMAT" "root.tex" && \
{ [ ! -f "$ROOT_NAME.bib" ] || $ISABELLE_TOOL latex -o bbl "root.tex"; } && \
{ [ ! -f "$ROOT_NAME.idx" ] || $ISABELLE_TOOL latex -o idx "root.tex"; } && \
$ISABELLE_TOOL latex -o "$OUTFORMAT" "root.tex" && \
$ISABELLE_TOOL latex -o "$OUTFORMAT" "root.tex"
$ISABELLE_PDFLATEX root && \
{ [ ! -f "$ROOT_NAME.bib" ] || $ISABELLE_BIBTEX root; } && \
{ [ ! -f "$ROOT_NAME.idx" ] || $ISABELLE_MAKEINDEX root; } && \
$ISABELLE_PDFLATEX root && \
$ISABELLE_PDFLATEX root
MISSING_CITATIONS=`grep 'Warning.*Citation' root.log | awk '{print $5}' | sort -u`
if [ "$MISSING_CITATIONS" != "" ]; then