Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF

Burkhart Wolff 2025-03-31 19:34:46 +02:00
commit fc3bb91d1a
22 changed files with 240 additions and 214 deletions

View File

@ -1,51 +0,0 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).
## [Unreleased]
### Added
### Changed
- Updated Isabelle version to Isabelle 2023
## [1.3.0] - 2022-07-08
### Changed
- The project-specific configuration is not part of the `ROOT` file, the formerly
used `isadof.cfg` is obsolete and no longer supported.
- Removed explicit use of `document/build` script. Requires removing the `build` script
entry from ROOT files.
- Isabelle/DOF is now a proper Isabelle component that should be installed using the
`isabelle components` command. The installation script is now only a convenient way
of installing the required AFP entries.
- `mkroot_DOF` has been renamed to `dof_mkroot` (and reimplemented in Scala).
## [1.2.0] - 2022-03-26
## [1.1.0] - 2021-03-20
### Added
- New antiquotations, consistency checks
### Changed
- Updated manual
- Restructured setup for ontologies (Isabelle theories and LaTeX styles)
## 1.0.0 - 2018-08-18
### Added
- First public release
[Unreleased]: https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF/compare/v1.3.0/Isabelle2021...HEAD
[1.3.0]: https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF/compare/v1.2.0/Isabelle2021...v1.3.0/Isabelle2021-1
[1.2.0]: https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF/compare/v1.1.0/Isabelle2021...v1.2.0/Isabelle2021
[1.1.0]: https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF/compare/v1.0.0/Isabelle2019...v1.1.0/Isabelle2021

View File

@ -513,9 +513,9 @@ text*[T2::technical]\<open>
\<^enum> \<^ML>\<open>HOLogic.mk_mem : term * term -> term\<close> \<^vs>\<open>-0.2cm\<close>
\<^enum> \<^ML>\<open>HOLogic.dest_mem : term -> term * term\<close> \<^vs>\<open>-0.2cm\<close>
\<^enum> \<^ML>\<open>HOLogic.mk_set : typ -> term list -> term\<close> \<^vs>\<open>-0.2cm\<close>
\<^enum> \<^ML>\<open>HOLogic.conj_intr : Proof.context -> thm -> thm -> thm\<close>, some HOL-level derived-inferences \<^vs>\<open>-0.2cm\<close>
\<^enum> \<^ML>\<open>HOLogic.conj_elim : Proof.context -> thm -> thm * thm\<close> \<^vs>\<open>-0.2cm\<close>
\<^enum> \<^ML>\<open>HOLogic.conj_elims : Proof.context -> thm -> thm list\<close> \<^vs>\<open>-0.2cm\<close>
\<^enum> \<^ML>\<open>HOLogic.conj_intr : thm -> thm -> thm\<close>, some HOL-level derived-inferences \<^vs>\<open>-0.2cm\<close>
\<^enum> \<^ML>\<open>HOLogic.conj_elim : thm -> thm * thm\<close> \<^vs>\<open>-0.2cm\<close>
\<^enum> \<^ML>\<open>HOLogic.conj_elims : thm -> thm list\<close> \<^vs>\<open>-0.2cm\<close>
\<^enum> \<^ML>\<open>HOLogic.conj : term\<close> , some ML level logical constructors \<^vs>\<open>-0.2cm\<close>
\<^enum> \<^ML>\<open>HOLogic.disj : term\<close> \<^vs>\<open>-0.2cm\<close>
\<^enum> \<^ML>\<open>HOLogic.imp : term\<close> \<^vs>\<open>-0.2cm\<close>
@ -1549,7 +1549,6 @@ Position.reports: Position.report list -> unit;
Markup.entity : string -> string -> Markup.T;
Markup.properties : Properties.T -> Markup.T -> Markup.T ;
Properties.get : Properties.T -> string -> string option;
Markup.enclose : Markup.T -> string -> string;
(* example for setting a link, the def flag controls if it is a defining or a binding
occurrence of an item *)
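The def/ref link idiom described in this comment can be sketched as follows. This is a hedged sketch using only the API names quoted above plus `Position.entity_properties_of`; the helper `report_entity` and its argument names are illustrative, not part of the source.

```sml
(* Hedged sketch of the link-setting idiom: Markup.entity creates the
   entity markup, Markup.properties attaches the position properties
   (where the boolean `def` flag distinguishes defining from binding
   occurrences), and Position.reports emits the report. *)
fun report_entity def kind name ser pos =
  Position.reports
    [(pos, Markup.entity kind name
             |> Markup.properties (Position.entity_properties_of def ser pos))];
```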
@ -2011,7 +2010,6 @@ and, from there, the generation of (non-layouted) strings :
\<^item> \<^ML>\<open>Syntax.pretty_arity: Proof.context -> arity -> Pretty.T\<close>
\<^item> \<^ML>\<open>Syntax.string_of_term: Proof.context -> term -> string \<close>
\<^item> \<^ML>\<open>Syntax.string_of_typ: Proof.context -> typ -> string \<close>
\<^item> \<^ML>\<open>Syntax.lookup_const : Syntax.syntax -> string -> string option\<close>
\<close>
@ -2089,11 +2087,8 @@ text\<open>where we have the registration of the action
section \<open> Output: Very Low Level \<close>
text\<open> For re-directing the output channels, the structure \<^ML_structure>\<open>Output\<close> may be relevant:
\<^ML>\<open>Output.output:string -> string\<close> is the structure for the "hooks" with the target devices.
\<close>
text\<open> For re-directing the output channels, the structure \<^ML_structure>\<open>Output\<close> may be relevant.\<close>
ML\<open> Output.output "bla_1:" \<close>
text\<open>It provides a number of hooks that can be used for redirection hacks ...\<close>
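As a hedged sketch of such a redirection hack (assuming the `Private_Output` hook references available in recent Isabelle versions; the exact hook signature varies between releases), the `writeln` channel can be rerouted like this:

```sml
(* Hedged sketch: redirect the writeln channel through a prefixing hook.
   Private_Output.writeln_fn is an Unsynchronized.ref holding the current
   hook; Output.output encodes a string for the target device. *)
val orig_writeln = ! Private_Output.writeln_fn;
val _ = Private_Output.writeln_fn :=
          (fn outs => orig_writeln (map (fn s => Output.output "[log] " ^ s) outs));
val _ = writeln "bla_1:";  (* now reaches the device prefixed with "[log] " *)
```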
@ -2240,8 +2235,8 @@ ML\<open>\<comment> \<open>Dynamic setup of inner syntax cartouche\<close>
let fun err () = raise TERM ("string_tr", args) in
(case args of
[(c as Const (@{syntax_const "_constrain"}, _)) $ Free (s, _) $ p] =>
(case Term_Position.decode_position p of
SOME (pos, _) => c $ f (mk_string f_mk accu (content (s, pos))) $ p
(case Term_Position.decode_position1 p of
SOME {pos, ...} => c $ f (mk_string f_mk accu (content (s, pos))) $ p
| NONE => err ())
| _ => err ())
end;

View File

@ -403,7 +403,7 @@ ML\<open>
(*all theorems used in the graph of nested proofs*)
val all_thms =
Proofterm.fold_body_thms
(fn {name, ...} => insert (op =) name) [body] [];
(fn {thm_name, ...} => insert (op =) thm_name) [body] [];
\<close>
term*\<open>@{thm \<open>Reification_Test.identity_conc\<close>}\<close>
@ -434,7 +434,7 @@ ML\<open>
(*all theorems used in the graph of nested proofs*)
val all_thms =
Proofterm.fold_body_thms
(fn {name, ...} => insert (op =) name) [body] [];
(fn {thm_name, ...} => insert (op =) thm_name) [body] [];
\<close>
declare[[show_sorts = false]]
@ -458,7 +458,7 @@ ML\<open>
(*all theorems used in the graph of nested proofs*)
val all_thms =
Proofterm.fold_body_thms
(fn {name, ...} => insert (op =) name) [body] [];
(fn {thm_name, ...} => insert (op =) thm_name) [body] [];
\<close>
prf test
@ -485,7 +485,7 @@ ML\<open>
(*all theorems used in the graph of nested proofs*)
val all_thms =
Proofterm.fold_body_thms
(fn {name, ...} => insert (op =) name) [body] [];
(fn {thm_name, ...} => insert (op =) thm_name) [body] [];
\<close>
prf test2
@ -510,7 +510,7 @@ ML\<open>
(*all theorems used in the graph of nested proofs*)
val all_thms =
Proofterm.fold_body_thms
(fn {name, ...} => insert (op =) name) [body] [];
(fn {thm_name, ...} => insert (op =) thm_name) [body] [];
\<close>
prf test3
@ -535,7 +535,7 @@ ML\<open>
(*all theorems used in the graph of nested proofs*)
val all_thms =
Proofterm.fold_body_thms
(fn {name, ...} => insert (op =) name) [body] [];
(fn {thm_name, ...} => insert (op =) thm_name) [body] [];
\<close>
prf test4
@ -560,7 +560,7 @@ ML\<open>
(*all theorems used in the graph of nested proofs*)
val all_thms =
Proofterm.fold_body_thms
(fn {name, ...} => insert (op =) name) [body] [];
(fn {thm_name, ...} => insert (op =) thm_name) [body] [];
\<close>
prf symmetric
@ -635,7 +635,7 @@ ML\<open>
(*all theorems used in the graph of nested proofs*)
val all_thms =
Proofterm.fold_body_thms
(fn {name, ...} => insert (op =) name) [body] [];
(fn {thm_name, ...} => insert (op =) thm_name) [body] [];
\<close>
ML\<open>
val thy = \<^theory>;
@ -700,7 +700,7 @@ ML\<open>
(*all theorems used in the graph of nested proofs*)
val all_thms =
Proofterm.fold_body_thms
(fn {name, ...} => insert (op =) name) [body] [];
(fn {thm_name, ...} => insert (op =) thm_name) [body] [];
\<close>
ML\<open>

View File

@ -162,12 +162,12 @@ fun ML_isa_elaborate_thms_of (thy:theory) _ _ term_option pos =
val thm_name = HOLogic.dest_string term
val thm = Proof_Context.get_thm (Proof_Context.init_global thy) thm_name
val body = Proofterm.strip_thm_body (Thm.proof_body_of thm)
val all_thms_name = Proofterm.fold_body_thms (fn {name, ...} => insert (op =) name) [body] []
val all_thms_name = Proofterm.fold_body_thms (fn {thm_name, ...} => insert (op =) thm_name) [body] []
(*val all_thms = map (Proof_Context.get_thm (Proof_Context.init_global thy)) all_thms_name*)
(*val all_proofs = map (Proof_Syntax.standard_proof_of
{full = true, expand_name = Thm.expand_name thm}) all_thms*)
(*in HOLogic.mk_list \<^Type>\<open>thm\<close> (map (fn proof => \<^Const>\<open>Thm_content\<close> $ reify_proofterm proof) all_proofs) end*)
in HOLogic.mk_list \<^typ>\<open>string\<close> (map HOLogic.mk_string all_thms_name) end
in HOLogic.mk_list \<^typ>\<open>string\<close> (map HOLogic.mk_string (map (Proof_Context.print_thm_name (Proof_Context.init_global thy)) all_thms_name)) end
fun ML_isa_elaborate_trace_attribute (thy:theory) _ _ term_option pos =
case term_option of

View File

@ -0,0 +1,20 @@
theory
"Isabelle_DOF-Scaffold"
imports
Isabelle_DOF.scholarly_paper
begin
list_templates
use_template "scrartcl"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -0,0 +1,9 @@
session "Isabelle_DOF-Scaffold" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
"Isabelle_DOF-Scaffold"
document_files
"preamble.tex"

View File

@ -0,0 +1 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -7,7 +7,7 @@ Isabelle/DOF allows for both conventional typesetting and formal development.
Isabelle/DOF has two major prerequisites:
* **Isabelle 2024:** Isabelle/DOF requires [Isabelle](https://isabelle.in.tum.de/)
* **Isabelle 2025:** Isabelle/DOF requires [Isabelle](https://isabelle.in.tum.de/)
and several entries from the [Archive of Formal Proofs
(AFP)](https://www.isa-afp.org/).
* **LaTeX:** Isabelle/DOF requires a modern LaTeX installation, i.e., at least
@ -33,7 +33,12 @@ Isabelle/DOF is provided as one AFP entry:
* [Isabelle_DOF:](https://www.isa-afp.org/entries/Isabelle_DOF.html) This entry
contains the Isabelle/DOF system itself, including the Isabelle/DOF manual.
### Installation of the Development Version (Git Repository)
### Installation of Add-On Packages and Examples
Add-ons to Isabelle/DOF, in particular, additional ontologies, document templates,
and examples, are provided on [Zenodo](https://doi.org/10.5281/zenodo.3370482).
## Installation of the Development Version (Git Repository)
The development version of Isabelle/DOF that is available in this Git repository
provides, over the AFP version, additional ontologies, document templates, and
@ -65,9 +70,8 @@ foo@bar:~$ isabelle build -d . -o 'timeout_scale=2' Isabelle_DOF-Proofs
## Usage
In the following, we assume that you installed Isabelle/DOF either from the AFP
(adding the AFP as a component to your Isabelle system) or from the Git
repository of Isabelle/DOF (installing Isabelle/DOF as a component to your
Isabelle system).
(adding the AFP as a component to your Isabelle system) with the add-ons
available on Zenodo (or from the Git repository of Isabelle/DOF).
Assuming that your current directory contains the example academic paper in the
subdirectory ``Isabelle_DOF-Example-I/``, you can open it similar
@ -121,12 +125,9 @@ in a subdirectory:
## Releases
For releases, signed archives including a PDF version of the Isabelle/DOF manual
are available. The latest release is **Isabelle/DOF 1.3.0/Isabelle2021-1**:
* [Isabelle_DOF-1.3.0_Isabelle2021-1.pdf](https://artifacts.logicalhacking.com/releases/Isabelle_DOF/Isabelle_DOF/Isabelle_DOF-1.3.0_Isabelle2021-1.pdf)
* [Isabelle_DOF-1.3.0_Isabelle2021-1.tar.xz](https://artifacts.logicalhacking.com/releases/Isabelle_DOF/Isabelle_DOF/Isabelle_DOF-1.3.0_Isabelle2021-1.tar.xz)
* [Isabelle_DOF-1.3.0_Isabelle2021-1.tar.xz.asc](https://artifacts.logicalhacking.com/releases/Isabelle_DOF/Isabelle_DOF/Isabelle_DOF-1.3.0_Isabelle2021-1.tar.xz.asc)
Since Isabelle 2024, Isabelle/DOF synchronises its releases with the releases
of Isabelle. The core of Isabelle/DOF is part of the Archive of Formal Proofs
and several add-on packages are released on Zenodo.
Older releases are available [here.](https://artifacts.logicalhacking.com/releases/Isabelle_DOF/Isabelle_DOF/)
@ -152,6 +153,19 @@ SPDX-License-Identifier: BSD-2-Clause
## Publications
* Achim D. Brucker, Idir Aït-Sadoune, Nicolas Méric, Burkhart Wolff: Parametric
ontologies in formal software engineering. Sci. Comput. Program. 241: 103231
(2025). [doi:10.1016/j.scico.2024.103231](https://doi.org/10.1016/j.scico.2024.103231)
* Nicolas Méric: An Ontology Framework for Formal Libraries: Doctoral Thesis at
the University Paris-Saclay. (Conception et Implémentation d'un Environnement
d'Ontologie pour des Bibliothèques Formelles). University of Paris-Saclay,
France, 2024. <https://tel.archives-ouvertes.fr/tel-04870527>
* Achim D. Brucker, Idir Aït-Sadoune, Nicolas Méric, Burkhart Wolff: Using Deep
Ontologies in Formal Software Engineering. ABZ 2023: 15-32.
[doi:10.1007/978-3-031-33163-3_2](https://doi.org/10.1007/978-3-031-33163-3_2)
* Achim D. Brucker, Idir Ait-Sadoune, Paolo Crisafulli, and Burkhart Wolff.
[Using The Isabelle Ontology Framework: Linking the Formal with the
Informal](https://www.brucker.ch/bibliography/download/2018/brucker.ea-isabelle-ontologies-2018.pdf).

View File

@ -4,12 +4,10 @@
Isabelle/DOF has three major prerequisites:
* **Isabelle:** Isabelle/DOF requires [Isabelle
2024](https://isabelle.in.tum.de/website-Isabelle2024/). Please download the
Isabelle 2024 distribution for your operating system from the [Isabelle
website](https://isabelle.in.tum.de/website-Isabelle2024/).
* **AFP:** Isabelle/DOF requires several entries from the [Archive of Formal Proofs
(AFP)](https://www.isa-afp.org/).
* **Isabelle:** Isabelle/DOF requires [Isabelle 2025](https://isabelle.in.tum.de).
* **AFP:** Isabelle/DOF requires several entries from the [development version of the Archive of Formal Proofs
(AFP)](https://devel.isa-afp.org/).
* **LaTeX:** Isabelle/DOF requires a modern LaTeX installation, i.e., at least
[TeX Live 2022](https://www.tug.org/texlive/) with all available updates
applied.

ROOTS
View File

@ -6,3 +6,4 @@ Isabelle_DOF-Example-I
Isabelle_DOF-Example-II
Isabelle_DOF-Examples-Extra
Isabelle_DOF-Examples-Templates
Isabelle_DOF-Scaffold

View File

@ -1,7 +1,7 @@
chapter AFP
session "Isabelle_DOF" (AFP) = "Functional-Automata" +
options [document_build = dof, timeout = 300]
options [document = pdf, document_output = "output", document_build = dof, timeout = 300]
sessions
"Regular-Sets"
directories
@ -54,6 +54,7 @@ session "Isabelle_DOF" (AFP) = "Functional-Automata" +
"figures/doc-mod-DOF.pdf"
"figures/doc-mod-term-aq.pdf"
"figures/ThisPaperWithPreviewer.png"
"figures/DockingDocumentPanel.png"
export_classpath

Binary file not shown (new image, 51 KiB).

View File

@ -24,10 +24,6 @@
\IfFileExists{hvlogos.sty}{\usepackage{hvlogos}}{\newcommand{\TeXLive}{\TeX Live}\newcommand{\BibTeX}{Bib\TeX}}
\usepackage{railsetup}
\setcounter{secnumdepth}{2}
\usepackage{index}
\newcommand{\bindex}[1]{\index{#1|textbf}}
%\makeindex
%\AtEndDocument{\printindex}
\newcommand{\dof}{DOF\xspace}
\newcommand{\isactrlemph}{*}
@ -112,9 +108,10 @@ France, and therefore granted with public funds of the Program ``Investissements
\end{minipage}
}
% Index setup
\usepackage{index}
\makeindex
\AtEndDocument{\printindex}
%\usepackage{index}
\newcommand{\bindex}[1]{\index{#1|textbf}}
%\makeindex
%\AtEndDocument{\printindex}
\newcommand{\DOFindex}[2]{%
\marginnote{\normalfont\textbf{#1}: #2}%
@ -122,3 +119,5 @@ France, and therefore granted with public funds of the Program ``Investissements
}%
\AtBeginDocument{\isabellestyle{literal}\newcommand{\lstnumberautorefname}{Line}}
\endinput
%:%file=$AFP/Isabelle_DOF/document/preamble.tex%:%

View File

@ -284,7 +284,7 @@ doc_class "corollary" = math_content +
invariant d :: "mcc \<sigma> = corr"
text\<open>A premise or premiss[a] is a proposition — a true or false declarative statement—
text\<open>A premise or premiss[a] is a proposition --- a true or false declarative statement---
used in an argument to prove the truth of another proposition called the conclusion.\<close>
doc_class "premise" = math_content +
referentiable :: bool <= "True"
@ -678,9 +678,9 @@ section\<open>Miscelleous\<close>
subsection\<open>Common Abbreviations\<close>
define_shortcut* eg \<rightleftharpoons> \<open>\eg\<close> (* Latin: „exempli gratia“ meaning „for example“. *)
ie \<rightleftharpoons> \<open>\ie\<close> (* Latin: „id est“ meaning „that is to say“. *)
etc \<rightleftharpoons> \<open>\etc\<close> (* Latin : „et cetera“ meaning „et cetera“ *)
define_shortcut* eg \<rightleftharpoons> \<open>\eg\<close> (* Latin: "exempli gratia" meaning "for example". *)
ie \<rightleftharpoons> \<open>\ie\<close> (* Latin: "id est" meaning "that is to say". *)
etc \<rightleftharpoons> \<open>\etc\<close> (* Latin : "et cetera" meaning "et cetera" *)
print_doc_classes

View File

@ -85,10 +85,10 @@ doc_class report =
section\<open>Experimental\<close>
(* switch on regexp syntax *)
notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
notation Plus (infixr "||" 55)
notation Times (infixr "~~" 60)
notation Atom ("\<lfloor>_\<rfloor>" 65)
notation Star (\<open>\<lbrace>(_)\<rbrace>\<^sup>*\<close> [0]100)
notation Plus (infixr \<open>||\<close> 55)
notation Times (infixr \<open>~~\<close> 60)
notation Atom (\<open>\<lfloor>_\<rfloor>\<close> 65)

View File

@ -39,10 +39,10 @@ import isabelle._
object DOF {
/** parameters **/
val isabelle_version = "2024"
val isabelle_url = "https://isabelle.in.tum.de/website-Isabelle2024"
val isabelle_version = "2025"
val isabelle_url = "https://isabelle.sketis.net/devel/release_snapshot/"
val afp_version = "afp-2024-06-05"
val afp_version = "afp-2025-03-16"
// Isabelle/DOF version: "Unreleased" for development, semantic version for releases
val version = "Unreleased"
@ -50,8 +50,8 @@ object DOF {
val session = "Isabelle_DOF"
val session_ontologies = "Isabelle_DOF-Ontologies"
val latest_version = "1.3.0"
val latest_isabelle = "Isabelle2021-1"
val latest_version = "2025"
val latest_isabelle = "Isabelle2025"
val latest_doi = "10.5281/zenodo.6810799"
val generic_doi = "10.5281/zenodo.3370482"

View File

@ -410,7 +410,7 @@ fun convert_meta_args ctxt (X, (((str,_),value) :: R)) =
let fun conv_int x = snd(HOLogic.dest_number(Syntax.read_term ctxt x))
handle TERM _ => error "Illegal int format."
in
(case YXML.content_of str of
(case Protocol_Message.clean_output str of
"relative_width" => upd_relative_width (K (conv_int value))
o convert_meta_args ctxt (X, R)
| "relative_height" => upd_relative_height (K (conv_int value))
@ -421,7 +421,7 @@ fun convert_meta_args ctxt (X, (((str,_),value) :: R)) =
|convert_meta_args _ (_,[]) = I
fun convert_src_from_margs ctxt (X, (((str,_),value)::R)) =
(case YXML.content_of str of
(case Protocol_Message.clean_output str of
"file_src" => Input.string (HOLogic.dest_string (Syntax.read_term ctxt value))
| _ => convert_src_from_margs ctxt (X,R))
|convert_src_from_margs _ (_, []) = error("No file_src provided.")
@ -851,12 +851,12 @@ val _ = block_antiquotation \<^binding>\<open>block\<close> (block_modes -- Scan
|> Theory.setup
fun convert_meta_args ctxt (X, (((str,_),value) :: R)) =
(case YXML.content_of str of
"frametitle" => upd_frametitle (K(YXML.content_of value |> read_string))
(case Protocol_Message.clean_output str of
"frametitle" => upd_frametitle (K(Protocol_Message.clean_output value |> read_string))
o convert_meta_args ctxt (X, R)
| "framesubtitle" => upd_framesubtitle (K(YXML.content_of value |> read_string))
| "framesubtitle" => upd_framesubtitle (K(Protocol_Message.clean_output value |> read_string))
o convert_meta_args ctxt (X, R)
| "options" => upd_options (K(YXML.content_of value |> read_string))
| "options" => upd_options (K(Protocol_Message.clean_output value |> read_string))
o convert_meta_args ctxt (X, R)
| s => error("!undefined attribute:"^s))
| convert_meta_args _ (_,[]) = I

View File

@ -124,7 +124,7 @@ struct
Name_Space.check (Context.Theory thy)
(get_onto_classes (Proof_Context.init_global thy)) (name, Position.none) |> #2
fun markup2string s = YXML.content_of s
fun markup2string s = Protocol_Message.clean_output s
|> Symbol.explode
|> List.filter (fn c => c <> Symbol.DEL)
|> String.concat
@ -876,6 +876,14 @@ val long_doc_class_prefix = ISA_prefix ^ "long_doc_class_"
fun is_ISA s = String.isPrefix ISA_prefix (Long_Name.base_name s)
val strip_positions_is_ISA =
let
fun strip ((t as Const ("_type_constraint_", \<^Type>\<open>fun A _\<close>)) $ (u as Const (s,_))) =
if Term_Position.detect_positionT A andalso is_ISA s then u else t $ u
| strip (t $ u) = strip t $ strip u
| strip (Abs (a, T, b)) = Abs (a, T, strip b)
| strip t = t;
in strip end;
fun transduce_term_global {mk_elaboration=mk_elaboration} (term,pos) thy =
(* pre: term should be fully typed in order to allow type-related term-transformations *)
@ -908,24 +916,27 @@ fun transduce_term_global {mk_elaboration=mk_elaboration} (term,pos) thy =
else Const(s, ty)
| T(Abs(s,ty,t)) = Abs(s,ty,T t)
| T t = t
in T term end
in T (strip_positions_is_ISA term) end
(* Elaborate an Isabelle_DOF term-antiquotation from an only parsed-term (not checked) *)
fun parsed_elaborate ctxt pos (Const(s,ty) $ t) =
if is_ISA s
then Syntax.check_term ctxt (Const(s, ty) $ t)
|> (fn t => transduce_term_global {mk_elaboration=true} (t , pos)
(Proof_Context.theory_of ctxt))
else (Const(s,ty) $ (parsed_elaborate ctxt pos t))
| parsed_elaborate ctxt pos (t1 $ t2) = parsed_elaborate ctxt pos (t1) $ parsed_elaborate ctxt pos (t2)
| parsed_elaborate ctxt pos (Const(s,ty)) =
if is_ISA s
then Syntax.check_term ctxt (Const(s, ty))
|> (fn t => transduce_term_global {mk_elaboration=true} (t , pos)
(Proof_Context.theory_of ctxt))
else Const(s,ty)
| parsed_elaborate ctxt pos (Abs(s,ty,t)) = Abs (s,ty,parsed_elaborate ctxt pos t)
| parsed_elaborate _ _ t = t
fun parsed_elaborate ctxt pos =
let
fun elaborate (Const(s,ty) $ t) =
if is_ISA s
then Syntax.check_term ctxt (Const(s, ty) $ t)
|> (fn t => transduce_term_global {mk_elaboration=true} (t , pos)
(Proof_Context.theory_of ctxt))
else (Const(s,ty) $ (elaborate t))
| elaborate (t1 $ t2) = elaborate (t1) $ elaborate (t2)
| elaborate (Const(s,ty)) =
if is_ISA s
then Syntax.check_term ctxt (Const(s, ty))
|> (fn t => transduce_term_global {mk_elaboration=true} (t , pos)
(Proof_Context.theory_of ctxt))
else Const(s,ty)
| elaborate (Abs(s,ty,t)) = Abs (s,ty,elaborate t)
| elaborate t = t
in elaborate o strip_positions_is_ISA end
fun elaborate_term' ctxt parsed_term = parsed_elaborate ctxt Position.none parsed_term
@ -938,23 +949,26 @@ fun check_term ctxt term = transduce_term_global {mk_elaboration=false}
(Proof_Context.theory_of ctxt)
(* Check an Isabelle_DOF term-antiquotation from an only parsed-term (not checked) *)
fun parsed_check ctxt pos (Const(s,ty) $ t) =
if is_ISA s
then let val _ = Syntax.check_term ctxt (Const(s, ty) $ t)
|> (fn t => transduce_term_global {mk_elaboration=false} (t , pos)
(Proof_Context.theory_of ctxt))
in (Const(s,ty) $ (parsed_check ctxt pos t)) end
else (Const(s,ty) $ (parsed_check ctxt pos t))
| parsed_check ctxt pos (t1 $ t2) = parsed_check ctxt pos (t1) $ parsed_check ctxt pos (t2)
| parsed_check ctxt pos (Const(s,ty)) =
if is_ISA s
then let val _ = Syntax.check_term ctxt (Const(s, ty))
|> (fn t => transduce_term_global {mk_elaboration=false} (t , pos)
(Proof_Context.theory_of ctxt))
in Const(s,ty) end
else Const(s,ty)
| parsed_check ctxt pos (Abs(s,ty,t)) = Abs (s,ty,parsed_check ctxt pos t)
| parsed_check _ _ t = t
fun parsed_check ctxt pos =
let
fun check (Const(s,ty) $ t) =
if is_ISA s
then let val _ = Syntax.check_term ctxt (Const(s, ty) $ t)
|> (fn t => transduce_term_global {mk_elaboration=false} (t , pos)
(Proof_Context.theory_of ctxt))
in (Const(s,ty) $ (check t)) end
else (Const(s,ty) $ (check t))
| check (t1 $ t2) = check (t1) $ check (t2)
| check (Const(s,ty)) =
if is_ISA s
then let val _ = Syntax.check_term ctxt (Const(s, ty))
|> (fn t => transduce_term_global {mk_elaboration=false} (t , pos)
(Proof_Context.theory_of ctxt))
in Const(s,ty) end
else Const(s,ty)
| check (Abs(s,ty,t)) = Abs (s,ty,check t)
| check t = t
in check o strip_positions_is_ISA end
fun check_term' ctxt parsed_term = parsed_check ctxt Position.none parsed_term
@ -1050,15 +1064,15 @@ fun trace_attr_t cid oid =
\<comment> \<open>and others in the future : file, http, thy, ...\<close>
datatype "typ" = Isabelle_DOF_typ string ("@{typ _}")
datatype "term" = Isabelle_DOF_term string ("@{term _}")
datatype "thm" = Isabelle_DOF_thm string ("@{thm _}")
datatype "file" = Isabelle_DOF_file string ("@{file _}")
datatype "thy" = Isabelle_DOF_thy string ("@{thy _}")
consts Isabelle_DOF_docitem :: "string \<Rightarrow> 'a" ("@{docitem _}")
datatype "docitem_attr" = Isabelle_DOF_docitem_attr string string ("@{docitemattr (_) :: (_)}")
consts Isabelle_DOF_trace_attribute :: "string \<Rightarrow> (string * string) list" ("@{trace'_attribute _}")
consts Isabelle_DOF_instances_of :: "string \<Rightarrow> 'a list" ("@{instances'_of _}")
datatype "typ" = Isabelle_DOF_typ string (\<open>@{typ _}\<close>)
datatype "term" = Isabelle_DOF_term string (\<open>@{term _}\<close>)
datatype "thm" = Isabelle_DOF_thm string (\<open>@{thm _}\<close>)
datatype "file" = Isabelle_DOF_file string (\<open>@{file _}\<close>)
datatype "thy" = Isabelle_DOF_thy string (\<open>@{thy _}\<close>)
consts Isabelle_DOF_docitem :: "string \<Rightarrow> 'a" (\<open>@{docitem _}\<close>)
datatype "docitem_attr" = Isabelle_DOF_docitem_attr string string (\<open>@{docitemattr (_) :: (_)}\<close>)
consts Isabelle_DOF_trace_attribute :: "string \<Rightarrow> (string * string) list" (\<open>@{trace'_attribute _}\<close>)
consts Isabelle_DOF_instances_of :: "string \<Rightarrow> 'a list" (\<open>@{instances'_of _}\<close>)
\<comment> \<open>Dynamic setup of inner syntax cartouche\<close>
@ -1082,15 +1096,15 @@ ML \<open>
let fun err () = raise TERM ("string_tr", args) in
(case args of
[(c as Const (@{syntax_const "_constrain"}, _)) $ Free (s, _) $ p] =>
(case Term_Position.decode_position p of
SOME (pos, _) => c $ f (mk_string f_mk accu (content (s, pos))) $ p
(case Term_Position.decode_position1 p of
SOME {pos, ...} => c $ f (mk_string f_mk accu (content (s, pos))) $ p
| NONE => err ())
| _ => err ())
end;
end;
\<close>
syntax "_cartouche_string" :: "cartouche_position \<Rightarrow> _" ("_")
syntax "_cartouche_string" :: "cartouche_position \<Rightarrow> _" (\<open>_\<close>)
ML\<open>
structure Cartouche_Grammar = struct
@ -1285,15 +1299,14 @@ fun declare_ISA_class_accessor_and_check_instance (params, doc_class_name, bind_
fun mixfix_enclose name = name |> enclose "@{" " _}"
val mixfix = clean_mixfix bname |> mixfix_enclose
val mixfix' = clean_mixfix doc_class_name |> mixfix_enclose
fun add_const (b, T, mx) =
Sign.add_consts [(b, T, mx)] #>
DOF_core.add_isa_transformer b
((check_instance, elaborate_instance) |> DOF_core.make_isa_transformer)
in
thy |> rm_mixfix bname' mixfix
|> Sign.add_consts [(bind, const_typ, Mixfix.mixfix mixfix)]
|> DOF_core.add_isa_transformer bind ((check_instance, elaborate_instance)
|> DOF_core.make_isa_transformer)
|> Sign.add_consts [(bind', const_typ, Mixfix.mixfix mixfix')]
|> DOF_core.add_isa_transformer bind' ((check_instance, elaborate_instance)
|> DOF_core.make_isa_transformer)
|> add_const (bind, const_typ, Mixfix.mixfix mixfix)
|> add_const (bind', const_typ, Mixfix.mixfix mixfix')
end
fun elaborate_instances_of thy _ _ term_option _ =
@ -1900,7 +1913,7 @@ fun create_and_check_docitem is_monitor {is_inline=is_inline} {define=define} bi
without using the burden of ontology classes.
ex: text*[sdf]\<open> Lorem ipsum @{thm refl}\<close> *)
else let
fun conv_attrs ((lhs, pos), rhs) = (YXML.content_of lhs,pos,"=", Syntax.parse_term (Proof_Context.init_global thy) rhs)
fun conv_attrs ((lhs, pos), rhs) = (Protocol_Message.clean_output lhs,pos,"=", Syntax.parse_term (Proof_Context.init_global thy) rhs)
val assns' = map conv_attrs doc_attrs
val defaults_init = create_default_object thy binding cid_long typ
fun conv (na, _(*ty*), parsed_term) =(Binding.name_of na, Binding.pos_of na, "=", parsed_term);
@ -2116,7 +2129,7 @@ fun update_instance_command ((binding, cid_pos),
val _ = if cid' = DOF_core.default_cid orelse cid = cid'
then ()
else error("incompatible classes:"^cid^":"^cid')
fun conv_attrs (((lhs, pos), opn), rhs) = ((YXML.content_of lhs),pos,opn,
fun conv_attrs (((lhs, pos), opn), rhs) = ((Protocol_Message.clean_output lhs),pos,opn,
Syntax.parse_term (Proof_Context.init_global thy) rhs)
val assns' = map conv_attrs doc_attrs
val def_trans_value =
@ -2258,9 +2271,9 @@ fun meta_args_2_latex thy sem_attrs transform_attr
fun markup2string s = String.concat (List.filter (fn c => c <> Symbol.DEL)
(Symbol.explode (YXML.content_of s)))
(Symbol.explode (Protocol_Message.clean_output s)))
fun ltx_of_markup ctxt s = let
val term = (Syntax.check_term ctxt o Syntax.parse_term ctxt) s
val term = (Syntax.check_term ctxt o Syntax.parse_term ctxt) s
val str_of_term = ltx_of_term ctxt true term
(* handle _ => "Exception in ltx_of_term" *)
in
@ -2270,7 +2283,7 @@ fun meta_args_2_latex thy sem_attrs transform_attr
val ctxt = Proof_Context.init_global thy
val actual_args = map (fn ((lhs,_),rhs) => (toLong lhs, ltx_of_markup ctxt rhs))
attr_list
val default_args =
val default_args =
(DOF_core.get_attribute_defaults cid_long thy)
|> map (fn (b,_, parsed_term) =>
(toLong (Long_Name.base_name ( Sign.full_name thy b))
@ -2560,13 +2573,13 @@ fun get_positions ctxt x =
fun get Cs (Const ("_type_constraint_", C) $ t) = get (C :: Cs) t
| get Cs (Free (y, T)) =
if x = y then
map_filter Term_Position.decode_positionT
maps Term_Position.decode_positionT
(T :: map (Type.constraint_type ctxt) Cs)
else []
| get _ (t $ u) = get [] t @ get [] u
| get _ (Abs (_, _, t)) = get [] t
| get _ _ = [];
in get [] end;
in map #pos o get [] end;
fun dummy_frees ctxt xs tss =
let
@ -3117,7 +3130,7 @@ fun add_doc_class_cmd overloaded (raw_params, binding)
val params = map (apsnd (Typedecl.read_constraint ctxt)) raw_params;
val ctxt1 = fold (Variable.declare_typ o TFree) params ctxt;
fun markup2string s = String.concat (List.filter (fn c => c <> Symbol.DEL)
(Symbol.explode (YXML.content_of s)))
(Symbol.explode (Protocol_Message.clean_output s)))
val name' =
case raw_parent of
NONE => DOF_core.default_cid
@ -3230,8 +3243,8 @@ fun add_onto_morphism classes_mappings eqs thy =
val converts =
map (fn (oclasses, dclass) =>
let
val oclasses_string = map YXML.content_of oclasses
val dclass_string = YXML.content_of dclass
val oclasses_string = map Protocol_Message.clean_output oclasses
val dclass_string = Protocol_Message.clean_output dclass
val const_sub_name = dclass_string
|> (oclasses_string |> fold_rev (fn x => fn y => x ^ "_" ^ y))
|> String.explode |> map (fn x => "\<^sub>" ^ (String.str x)) |> String.concat

View File

@ -41,15 +41,15 @@ text\<open> The implementation of the monitoring concept follows the following d
section\<open>Monitor Syntax over RegExp - constructs\<close>
notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
notation Plus (infixr "||" 55)
notation Times (infixr "~~" 60)
notation Atom ("\<lfloor>_\<rfloor>" 65)
notation Star (\<open>\<lbrace>(_)\<rbrace>\<^sup>*\<close> [0]100)
notation Plus (infixr \<open>||\<close> 55)
notation Times (infixr \<open>~~\<close> 60)
notation Atom (\<open>\<lfloor>_\<rfloor>\<close> 65)
definition rep1 :: "'a rexp \<Rightarrow> 'a rexp" ("\<lbrace>(_)\<rbrace>\<^sup>+")
definition rep1 :: "'a rexp \<Rightarrow> 'a rexp" (\<open>\<lbrace>(_)\<rbrace>\<^sup>+\<close>)
where "\<lbrace>A\<rbrace>\<^sup>+ \<equiv> A ~~ \<lbrace>A\<rbrace>\<^sup>*"
definition opt :: "'a rexp \<Rightarrow> 'a rexp" ("\<lbrakk>(_)\<rbrakk>")
definition opt :: "'a rexp \<Rightarrow> 'a rexp" (\<open>\<lbrakk>(_)\<rbrakk>\<close>)
where "\<lbrakk>A\<rbrakk> \<equiv> A || One"
value "Star (Conc(Alt (Atom(CHR ''a'')) (Atom(CHR ''b''))) (Atom(CHR ''c'')))"
@@ -302,12 +302,12 @@ lemma mono_rep1_star':"L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub
using mono_Star' rep1_star_incl' by blast
no_notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
no_notation Plus (infixr "||" 55)
no_notation Times (infixr "~~" 60)
no_notation Atom ("\<lfloor>_\<rfloor>" 65)
no_notation rep1 ("\<lbrace>(_)\<rbrace>\<^sup>+")
no_notation opt ("\<lbrakk>(_)\<rbrakk>")
no_notation Star (\<open>\<lbrace>(_)\<rbrace>\<^sup>*\<close> [0]100)
no_notation Plus (infixr \<open>||\<close> 55)
no_notation Times (infixr \<open>~~\<close> 60)
no_notation Atom (\<open>\<lfloor>_\<rfloor>\<close> 65)
no_notation rep1 (\<open>\<lbrace>(_)\<rbrace>\<^sup>+\<close>)
no_notation opt (\<open>\<lbrakk>(_)\<rbrakk>\<close>)
ML\<open>
structure RegExpInterface_Notations =


@@ -23,7 +23,7 @@ begin
(*<*)
definition combinator1 :: "'a \<Rightarrow> ('a \<Rightarrow> 'b) \<Rightarrow> 'b" (infixl "|>" 65)
definition combinator1 :: "'a \<Rightarrow> ('a \<Rightarrow> 'b) \<Rightarrow> 'b" (infixl \<open>|>\<close> 65)
where "x |> f = f x"


@@ -190,7 +190,7 @@ onto_morphism (acm) to elsevier
text\<open>Here is a more basic, but equivalent definition for the other way round:\<close>
definition elsevier_to_acm_morphism :: "elsevier_org \<Rightarrow> acm_org"
("_ \<langle>acm\<rangle>\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r" [1000]999)
(\<open>_ \<langle>acm\<rangle>\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r\<close> [1000]999)
where "\<sigma> \<langle>acm\<rangle>\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r = \<lparr> acm_org.tag_attribute = 1::int,
acm_org.position = ''no position'',
acm_org.institution = organization \<sigma>,
@@ -318,19 +318,19 @@ the presentation of them) is very different. Besides, morphism functions can be
(\<^ie> surjective), ``embedding'' (\<^ie> injective) or even ``one-to-one'' (\<^ie> bijective).\<close>
definition Item_to_Resource_morphism :: "Item \<Rightarrow> Resource"
("_ \<langle>Resource\<rangle>\<^sub>I\<^sub>t\<^sub>e\<^sub>m" [1000]999)
(\<open>_ \<langle>Resource\<rangle>\<^sub>I\<^sub>t\<^sub>e\<^sub>m\<close> [1000]999)
where " \<sigma> \<langle>Resource\<rangle>\<^sub>I\<^sub>t\<^sub>e\<^sub>m =
\<lparr> Resource.tag_attribute = 1::int ,
Resource.name = name \<sigma> \<rparr>"
definition Product_to_Resource_morphism :: "Product \<Rightarrow> Resource"
("_ \<langle>Resource\<rangle>\<^sub>P\<^sub>r\<^sub>o\<^sub>d\<^sub>u\<^sub>c\<^sub>t" [1000]999)
(\<open>_ \<langle>Resource\<rangle>\<^sub>P\<^sub>r\<^sub>o\<^sub>d\<^sub>u\<^sub>c\<^sub>t\<close> [1000]999)
where " \<sigma> \<langle>Resource\<rangle>\<^sub>P\<^sub>r\<^sub>o\<^sub>d\<^sub>u\<^sub>c\<^sub>t =
\<lparr> Resource.tag_attribute = 2::int ,
Resource.name = name \<sigma> \<rparr>"
definition Computer_Software_to_Software_morphism :: "Computer_Software \<Rightarrow> Software"
("_ \<langle>Software\<rangle>\<^sub>S\<^sub>o\<^sub>f\<^sub>t\<^sub>C\<^sub>m\<^sub>p" [1000]999)
(\<open>_ \<langle>Software\<rangle>\<^sub>S\<^sub>o\<^sub>f\<^sub>t\<^sub>C\<^sub>m\<^sub>p\<close> [1000]999)
where "\<sigma> \<langle>Software\<rangle>\<^sub>S\<^sub>o\<^sub>f\<^sub>t\<^sub>C\<^sub>m\<^sub>p =
\<lparr> Resource.tag_attribute = 3::int ,
Resource.name = name \<sigma> ,
@@ -339,7 +339,7 @@ definition Computer_Software_to_Software_morphism :: "Computer_Software \<Righta
Software.version = version \<sigma> \<rparr>"
definition Electronic_Component_to_Component_morphism :: "Electronic_Component \<Rightarrow> Component"
("_ \<langle>Component\<rangle>\<^sub>E\<^sub>l\<^sub>e\<^sub>c\<^sub>C\<^sub>m\<^sub>p" [1000]999)
(\<open>_ \<langle>Component\<rangle>\<^sub>E\<^sub>l\<^sub>e\<^sub>c\<^sub>C\<^sub>m\<^sub>p\<close> [1000]999)
where "\<sigma> \<langle>Component\<rangle>\<^sub>E\<^sub>l\<^sub>e\<^sub>c\<^sub>C\<^sub>m\<^sub>p =
\<lparr> Resource.tag_attribute = 4::int ,
Resource.name = name \<sigma> ,
@@ -348,7 +348,7 @@ definition Electronic_Component_to_Component_morphism :: "Electronic_Component \
Component.mass = mass \<sigma> \<rparr>"
definition Monitor_to_Hardware_morphism :: "Monitor \<Rightarrow> Hardware"
("_ \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e" [1000]999)
(\<open>_ \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e\<close> [1000]999)
where "\<sigma> \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e =
\<lparr> Resource.tag_attribute = 5::int ,
Resource.name = name \<sigma> ,
@@ -390,10 +390,10 @@ subsection\<open>Proving Monitor-Refinements\<close>
(*<*)
(* switch on regexp syntax *)
notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
notation Plus (infixr "||" 55)
notation Times (infixr "~~" 60)
notation Atom ("\<lfloor>_\<rfloor>" 65)
notation Star (\<open>\<lbrace>(_)\<rbrace>\<^sup>*\<close> [0]100)
notation Plus (infixr \<open>||\<close> 55)
notation Times (infixr \<open>~~\<close> 60)
notation Atom (\<open>\<lfloor>_\<rfloor>\<close> 65)
(*>*)
@@ -442,10 +442,10 @@ of the above language refinement is quasi automatic. This proof is also part of
(*<*)
(* switch off regexp syntax *)
no_notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
no_notation Plus (infixr "||" 55)
no_notation Times (infixr "~~" 60)
no_notation Atom ("\<lfloor>_\<rfloor>" 65)
no_notation Star (\<open>\<lbrace>(_)\<rbrace>\<^sup>*\<close> [0]100)
no_notation Plus (infixr \<open>||\<close> 55)
no_notation Times (infixr \<open>~~\<close> 60)
no_notation Atom (\<open>\<lfloor>_\<rfloor>\<close> 65)
end
(*>*)
(*>*)


@@ -614,7 +614,7 @@ subsubsection\<open>Examples\<close>
text\<open>
\<^item> common short-cut hiding \<^LaTeX> code in the integrated source:
@{theory_text [display]
\<open>define_shortcut* eg \<rightleftharpoons> \<open>\eg\<close> (* Latin: „exempli gratia“ meaning „for example“. *)
\<open>define_shortcut* eg \<rightleftharpoons> \<open>\eg\<close> (* Latin: "exempli gratia" meaning "for example". *)
clearpage \<rightleftharpoons> \<open>\clearpage{}\<close>\<close>}
\<^item> non-checking macro:
@{theory_text [display]
@@ -1369,18 +1369,44 @@ value*\<open>filter (\<lambda>\<sigma>. the (text_section.level \<sigma>) > 1) @
\<close>
section*[infrastructure::technical]\<open>Technical Infrastructure\<close>
figure*[
dock_document_panel::figure
, relative_width="65"
, file_src="''figures/DockingDocumentPanel.png''"
]\<open>Re-configuring the docking options of the Document Panel.\<close>
subsection\<open>The Previewer\<close>
(*<*)
declare_reference*[global_DOF_view::figure]
(*>*)
subsection\<open>Activating the Previewer\<close>
text\<open>The \<^isadof> distribution also comes with a \<^emph>\<open>previewer facility\<close> integrated into \<open>Isabelle/jEdit\<close>
\<^footnote>\<open>This is part of the \<open>Isabelle_DOF-add_ons\<close> packages available on the Isabelle Zenodo page
\<^url>\<open>https://zenodo.org/records/6810799\<close>. For technical reasons, it cannot be part of the
\<^isadof> core available in the AFP.\<close> It supports incremental continuous PDF generation, which
improves usability. Currently, the granularity is restricted to entire theories, which have to
be selected in a specific \<^emph>\<open>document panel\<close>. The panel is activated as follows:
\<^item> Select it under the jEdit menu \\
\<open>Isabelle \<longlongrightarrow> Plugins \<longlongrightarrow> Isabelle \<longlongrightarrow> Document Panel\<close>
\<^item> Dock the pop-up panel as you like via the docking menu
(see @{figure \<open>dock_document_panel\<close>}).
Figure @{figure (unchecked) \<open>global_DOF_view\<close>} shows the document panel
on top of the editing panel.
\<close>
subsection\<open>Using the Previewer\<close>
figure*[
global_DOF_view::figure
, relative_width="95"
, file_src="''figures/ThisPaperWithPreviewer.png''"
]\<open>A Screenshot while editing this Paper in \<^dof> with Preview.\<close>
text\<open>A screenshot of the editing environment is shown in \<^figure>\<open>global_DOF_view\<close>. It supports
incremental continuous PDF generation which improves usability. Currently, the granularity
is restricted to entire theories (which have to be selected in a specific document pane).
The response times can not (yet) compete
text\<open>A screenshot of the editing environment is shown in \<^figure>\<open>global_DOF_view\<close>. When
set to the right session, the document panel allows for restricting the \<^LaTeX> generation
to specific theories of the document, thus supporting incremental continuous PDF generation
for parts of the overall document and improving usability.
The response times of the previewer cannot (yet) compete
with a Word or Overleaf editor, though, which is mostly due to the checking and
evaluation overhead (the turnaround of this section is about 30 s). However, we believe
that better parallelization and evaluation techniques will decrease this gap substantially for the