FINISHED MY PASS ON THE PROGRAMMING MANUAL.

This commit is contained in:
Burkhart Wolff 2020-10-25 12:02:44 +01:00
parent 7999ee9a38
commit 3c6197e6ca
1 changed files with 181 additions and 199 deletions


@ -1949,7 +1949,8 @@ text\<open>The heart of the parsers for mathematical notation, based on an Earle
text\<open> Note that the naming follows the convention below; there are:
\<^enum> "parser"s
\<^enum> type-"checker"s, which usually also encompass the markup generation for PIDE
\<^enum> "reader"s, which do both, together with pretty-printing
This is encapsulated in the data structure @{ML_structure Syntax} ---
@ -1987,8 +1988,8 @@ text\<open>
\<^item> \<^ML>\<open>Syntax.read_prop_global: theory -> string -> term\<close>
\<close>
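text\<open>A minimal usage sketch of these readers (the proposition \<open>A \<Longrightarrow> A\<close> is just an
arbitrary example):\<close>
ML\<open>
(* parse and type-check a proposition against the background theory *)
val t = Syntax.read_prop_global \<^theory> "A \<Longrightarrow> A";
\<close>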
text\<open>The following operations are concerned with the conversion of pretty-prints
and, from there, the generation of (non-layouted) strings:
\<^item> \<^ML>\<open>Syntax.pretty_term: Proof.context -> term -> Pretty.T\<close>
\<^item> \<^ML>\<open>Syntax.pretty_typ: Proof.context -> typ -> Pretty.T\<close>
\<^item> \<^ML>\<open>Syntax.pretty_sort: Proof.context -> sort -> Pretty.T\<close>
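text\<open>These pretty-printing operations compose with \<^ML>\<open>Pretty.string_of\<close>; a minimal
sketch:\<close>
ML\<open>
(* pretty-print a term from the current context and flatten it into a string *)
val s = Pretty.string_of (Syntax.pretty_term \<^context> \<^term>\<open>a \<and> b\<close>);
\<close>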
@ -2036,12 +2037,6 @@ text\<open>
subsection*[ex33::example] \<open>Example\<close>
ML\<open>
(* Recall the Args interface to the more high-level, Isar-specific parsers: *)
Args.name : string parser ;
Args.const : {proper: bool, strict: bool} -> string context_parser;
Args.cartouche_input: Input.source parser;
Args.text_token : Token.T parser;
(* Here follows the definition of the attribute parser: *)
val Z = let val attribute = Parse.position Parse.name --
@ -2051,129 +2046,117 @@ val Z = let val attribute = Parse.position Parse.name --
(* Here is the code to register the above parsers as text antiquotations into the Isabelle
Framework: *)
Thy_Output.antiquotation_pretty_source \<^binding>\<open>theory\<close>
(Scan.lift (Parse.position Args.embedded));
Thy_Output.antiquotation_raw \<^binding>\<open>file\<close>
(Scan.lift (Parse.position Parse.path)) ;
\<close>
text\<open>Here we have the registration of the action
\<^ML>\<open>Scan.lift (Parse.position Args.cartouche_input)\<close>
to be bound to a \<open>name\<close>; the registration as a whole is a system
transaction that, of course, has the type \<^ML_type>\<open>theory -> theory\<close>:
@{ML [display] \<open>
(fn name => (Thy_Output.antiquotation_pretty_source
                name
                (Scan.lift (Parse.position Args.cartouche_input))))
  : binding ->
    (Proof.context -> Input.source * Position.T -> Pretty.T) ->
    theory -> theory
\<close>}
\<close>
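text\<open>Following this scheme, registering a further antiquotation is a short transaction.
The sketch below registers a hypothetical antiquotation \<open>upper\<close> that pretty-prints its
argument in upper-case; the name and behaviour are invented for illustration only:\<close>
ML\<open>
val _ = Theory.setup
  (Thy_Output.antiquotation_pretty \<^binding>\<open>upper\<close>
     (Scan.lift Args.name)                                      (* parse a plain name *)
     (fn _ => fn s => Pretty.str (String.map Char.toUpper s))); (* render it upper-case *)
\<close>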
section \<open> Output: Very Low Level \<close>
text\<open>For redirecting the output channels, the structure \<^ML_structure>\<open>Output\<close> may be relevant:
\<^ML>\<open>Output.output: string -> string\<close> is the central conversion hook towards the target devices.
\<close>
ML\<open> Output.output "bla_1:" \<close>
text\<open>It provides a number of hooks that can be used for redirection hacks ...\<close>
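text\<open>A small sketch: the encoding of a string containing a special Isabelle symbol, as it
would be sent towards the target device:\<close>
ML\<open>
(* Output.output performs the device-specific encoding of a string *)
val enc = Output.output "A \<Longrightarrow> B";
\<close>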
section \<open> Output: LaTeX \<close>
text\<open>The heart of the LaTeX generator is to be found in the structure \<^ML_structure>\<open>Thy_Output\<close>.
This is a parsing and writing process of its own, with the risk that a file accepted by the IDE
parsing process cannot be parsed by the LaTeX generator. The reason is twofold:
\<^enum> the LaTeX generator makes a rough attempt to mimic the layout of the thy-file; thus, its
  spacing is relevant;
\<^enum> there is a special bracket \<open>(*<*)\<close> ... \<open>(*>*)\<close> that allows one to specify input that is
  checked by Isabelle, but excluded from the LaTeX generator (this is handled in a dedicated
  sub-parser called \<^ML>\<open>Document_Source.improper\<close>, where also other forms of comment
  parsers are provided).
Since Isabelle2018, a dedicated AST is provided for the LaTeX syntax, analogous to
\<^ML_structure>\<open>Pretty\<close>. Key functions of this structure \<^ML_structure>\<open>Latex\<close> are:
\<^item>\<^ML>\<open>Latex.string: string -> Latex.text\<close>
\<^item>\<^ML>\<open>Latex.text: string * Position.T -> Latex.text\<close>
\<^item>\<^ML>\<open>Latex.output_text: Latex.text list -> string\<close>
\<^item>\<^ML>\<open>Latex.output_positions: Position.T -> Latex.text list -> string\<close>
\<^item>\<^ML>\<open>Latex.output_name: string -> string\<close>
\<^item>\<^ML>\<open>Latex.output_ascii: string -> string\<close>
\<^item>\<^ML>\<open>Latex.output_symbols: Symbol.symbol list -> string\<close>
\<^item>\<^ML>\<open>Latex.begin_delim: string -> string\<close>
\<^item>\<^ML>\<open>Latex.end_delim: string -> string\<close>
\<^item>\<^ML>\<open>Latex.begin_tag: string -> string\<close>
\<^item>\<^ML>\<open>Latex.end_tag: string -> string\<close>
\<^item>\<^ML>\<open>Latex.environment_block: string -> Latex.text list -> Latex.text\<close>
\<^item>\<^ML>\<open>Latex.environment: string -> string -> string\<close>
\<^item>\<^ML>\<open>Latex.block: Latex.text list -> Latex.text\<close>
\<^item>\<^ML>\<open>Latex.enclose_body: string -> string -> Latex.text list -> Latex.text list\<close>
\<^item>\<^ML>\<open>Latex.enclose_block: string -> string -> Latex.text list -> Latex.text\<close>
\<close>
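text\<open>A small sketch combining some of these operations: build a LaTeX fragment and render
it back into a string (the emphasis macro is an arbitrary choice):\<close>
ML\<open>
val txt = Latex.enclose_block "\\emph{" "}"
            [Latex.string (Latex.output_ascii "n < m")];  (* ASCII-escape, then wrap *)
val s   = Latex.output_text [txt];                        (* flatten to a string *)
\<close>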
ML\<open> Latex.output_ascii;
Latex.environment "isa" "bg";
Latex.output_ascii "a_b:c'é";
(* Note also the generic separator combinator: *)
space_implode "sd &e sf dfg" ["qs","er","alpa"];
\<close>
text\<open>Here is an overview of the main interface of \<^ML_structure>\<open>Thy_Output\<close>:\<close>
text\<open>
\<^item>\<^ML>\<open>Thy_Output.output_document: Proof.context -> {markdown: bool} -> Input.source -> Latex.text list\<close>
\<^item>\<^ML>\<open>Thy_Output.output_token: Proof.context -> Token.T -> Latex.text list\<close>
\<^item>\<^ML>\<open>Thy_Output.output_source: Proof.context -> string -> Latex.text list\<close>
\<^item>\<^ML>\<open>Thy_Output.present_thy: Options.T -> theory -> Thy_Output.segment list -> Latex.text list\<close>
\<^item>\<^ML>\<open>Thy_Output.isabelle: Proof.context -> Latex.text list -> Latex.text\<close>
\<^item>\<^ML>\<open>Thy_Output.isabelle_typewriter: Proof.context -> Latex.text list -> Latex.text\<close>
\<^item>\<^ML>\<open>Thy_Output.typewriter: Proof.context -> string -> Latex.text\<close>
\<^item>\<^ML>\<open>Thy_Output.verbatim: Proof.context -> string -> Latex.text\<close>
\<^item>\<^ML>\<open>Thy_Output.source: Proof.context -> {embedded: bool} -> Token.src -> Latex.text\<close>
\<^item>\<^ML>\<open>Thy_Output.pretty: Proof.context -> Pretty.T -> Latex.text\<close>
\<^item>\<^ML>\<open>Thy_Output.pretty_source: Proof.context -> {embedded: bool} -> Token.src -> Pretty.T -> Latex.text\<close>
\<^item>\<^ML>\<open>Thy_Output.pretty_items: Proof.context -> Pretty.T list -> Latex.text\<close>
\<^item>\<^ML>\<open>Thy_Output.pretty_items_source: Proof.context -> {embedded: bool} -> Token.src -> Pretty.T list -> Latex.text\<close>
Finally, a number of antiquotation registries:
\<^item>\<^ML>\<open>Thy_Output.antiquotation_pretty:
binding -> 'a context_parser -> (Proof.context -> 'a -> Pretty.T) -> theory -> theory\<close>
\<^item>\<^ML>\<open>Thy_Output.antiquotation_pretty_source:
binding -> 'a context_parser -> (Proof.context -> 'a -> Pretty.T) -> theory -> theory\<close>
\<^item>\<^ML>\<open>Thy_Output.antiquotation_raw:
binding -> 'a context_parser -> (Proof.context -> 'a -> Latex.text) -> theory -> theory\<close>
\<^item>\<^ML>\<open>Thy_Output.antiquotation_verbatim:
binding -> 'a context_parser -> (Proof.context -> 'a -> string) -> theory -> theory\<close>
\<close>
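text\<open>For instance, a hypothetical raw antiquotation that emits its argument as
ASCII-escaped LaTeX can be sketched from the signatures above (the binding \<open>ascii\<close>
is invented for illustration):\<close>
ML\<open>
val _ = Theory.setup
  (Thy_Output.antiquotation_raw \<^binding>\<open>ascii\<close>
     (Scan.lift Args.name)                                   (* parse a plain name *)
     (fn _ => fn s => Latex.string (Latex.output_ascii s))); (* escape and emit *)
\<close>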
text\<open> Thus, \<^ML_structure>\<open>Syntax_Phases\<close> does the actual work of markup generation,
including the generation of reports. Look at the following snippet:
\<open>
fun check_typs ctxt raw_tys =
let
val (sorting_report, tys) = Proof_Context.prepare_sortsT ctxt raw_tys;
@ -2183,11 +2166,11 @@ fun check_typs ctxt raw_tys =
|> apply_typ_check ctxt
|> Term_Sharing.typs (Proof_Context.theory_of ctxt)
end;
\<close>
which is the real implementation behind \<^ML>\<open>Syntax.check_typ\<close>; or:
\<open>
fun check_terms ctxt raw_ts =
let
val (sorting_report, raw_ts') = Proof_Context.prepare_sorts ctxt raw_ts;
@ -2208,12 +2191,10 @@ fun check_terms ctxt raw_ts =
if Context_Position.is_visible ctxt then Output.report (sorting_report @ typing_report)
else ();
in Term_Sharing.terms (Proof_Context.theory_of ctxt) ts' end;
\<close>
which is the real implementation behind \<^ML>\<open>Syntax.check_term\<close>. As one can see, check-routines
internally generate the markup.
\<close>
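text\<open>These checking phases can be exercised directly; a minimal sketch separating parsing
from checking:\<close>
ML\<open>
val t  = Syntax.parse_term \<^context> "A \<and> B"; (* parse: yields an unchecked pre-term *)
val t' = Syntax.check_term \<^context> t; (* check: types, sorts, and markup reports *)
\<close>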
@ -2291,15 +2272,16 @@ fun parse_translation_cartouche binding l f_integer accu =
\<close>
text\<open>The following registration of this cartouche for strings fails because it has
already been done in the surrounding Isabelle/DOF environment...
\<^verbatim>\<open>
parse_translation \<open>
[( @{syntax_const "_cartouche_string"}
, parse_translation_cartouche \<^binding>\<open>cartouche_type\<close> Cartouche_Grammar.default (K I) ())]
\<close>
\<close>
\<close>
text\<open> Test for this cartouche... \<close>
term "\<open>A \<Longrightarrow> B\<close> = ''''"
@ -2310,9 +2292,9 @@ text\<open> This interactive Isabelle Programming Cook-Book represents my curren
It differs from the reference manual in some places on purpose, since I believe
that a lot of internal Isabelle API's need a more conceptual view on what is happening
(even if this conceptual view is at times over-abstracting a little).
It is written in Isabelle/DOF and conceived as "living document" (a term that I owe to
Simon Foster), i.e. as hypertext-heavy text making direct references to the Isabelle API's
which were checked whenever this document is re-visited in Isabelle/jEdit.
All hints and contributions of colleagues and collaborators are greatly welcomed; all errors
and the roughness of this presentation are entirely my fault.