Compare commits

..

1 Commit

Author SHA1 Message Date
Nicolas Méric 0ce9e6f2d0 Implement chapter*, section*, etc. latex generation in SML
ci/woodpecker/push/build: pipeline failed
2023-09-22 08:54:10 +02:00
110 changed files with 502 additions and 6352 deletions

View File

@ -1,6 +1,6 @@
pipeline: pipeline:
build: build:
image: git.logicalhacking.com/lh-docker/lh-docker-isabelle/isabelle_nightly:latest image: git.logicalhacking.com/lh-docker/lh-docker-isabelle/isabelle2023:latest
pull: true pull: true
commands: commands:
- hg log --limit 2 /root/isabelle - hg log --limit 2 /root/isabelle
@ -23,7 +23,7 @@ pipeline:
- cd ../.. - cd ../..
- ln -s * latest - ln -s * latest
archive: archive:
image: git.logicalhacking.com/lh-docker/lh-docker-isabelle/isabelle_nightly:latest image: git.logicalhacking.com/lh-docker/lh-docker-isabelle/isabelle2023:latest
commands: commands:
- export ARTIFACT_DIR=$CI_WORKSPACE/.artifacts/$CI_REPO/$CI_BRANCH/$CI_BUILD_NUMBER/$LATEX - export ARTIFACT_DIR=$CI_WORKSPACE/.artifacts/$CI_REPO/$CI_BRANCH/$CI_BUILD_NUMBER/$LATEX
- mkdir -p $ARTIFACT_DIR - mkdir -p $ARTIFACT_DIR

View File

@ -544,7 +544,7 @@ text*[T4::technical]\<open>
\<^enum> \<^ML>\<open>Sign.typ_match: theory -> typ * typ -> Type.tyenv -> Type.tyenv\<close> \<^enum> \<^ML>\<open>Sign.typ_match: theory -> typ * typ -> Type.tyenv -> Type.tyenv\<close>
\<^enum> \<^ML>\<open>Sign.typ_unify: theory -> typ * typ -> Type.tyenv * int -> Type.tyenv * int\<close> \<^enum> \<^ML>\<open>Sign.typ_unify: theory -> typ * typ -> Type.tyenv * int -> Type.tyenv * int\<close>
\<^enum> \<^ML>\<open>Sign.const_type: theory -> string -> typ option\<close> \<^enum> \<^ML>\<open>Sign.const_type: theory -> string -> typ option\<close>
\<^enum> \<^ML>\<open>Sign.certify_term: theory -> term -> term * typ\<close>, the function for CERTIFICATION of types \<^enum> \<^ML>\<open>Sign.certify_term: theory -> term -> term * typ * int\<close>, the function for CERTIFICATION of types
\<^enum> \<^ML>\<open>Sign.cert_term: theory -> term -> term\<close>, short-cut for the latter \<^enum> \<^ML>\<open>Sign.cert_term: theory -> term -> term\<close>, short-cut for the latter
\<^enum> \<^ML>\<open>Sign.tsig_of: theory -> Type.tsig\<close>, projects from a theory the type signature \<^enum> \<^ML>\<open>Sign.tsig_of: theory -> Type.tsig\<close>, projects from a theory the type signature
\<close> \<close>
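An editorial illustration of the Sign operations listed above (a sketch, not part of the diff; it assumes the Isabelle2023 ML signatures shown, including the corrected triple result of Sign.certify_term):

```sml
ML\<open>
(* match a schematic type against a concrete one; the empty tyenv is Vartab.empty *)
val tyenv = Sign.typ_match @{theory}
              (@{typ "?'a list"}, @{typ "int list"}) Vartab.empty;

(* certify a term: note the extra int component in the corrected signature *)
val (t, ty, _) = Sign.certify_term @{theory} @{term "1::nat"};
\<close>
```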
@ -610,6 +610,8 @@ text\<open>Now we turn to the crucial issue of type-instantiation and with a giv
subsection\<open>More operations on types\<close> subsection\<open>More operations on types\<close>
text\<open> text\<open>
\<^item> \<^ML>\<open>Term_Subst.map_types_same : (typ -> typ) -> term -> term\<close>
\<^item> \<^ML>\<open>Term_Subst.map_aterms_same : (term -> term) -> term -> term\<close>
\<^item> \<^ML>\<open>Term_Subst.instantiate: typ TVars.table * term Vars.table -> term -> term\<close> \<^item> \<^ML>\<open>Term_Subst.instantiate: typ TVars.table * term Vars.table -> term -> term\<close>
\<^item> \<^ML>\<open>Term_Subst.instantiateT: typ TVars.table -> typ -> typ\<close> \<^item> \<^ML>\<open>Term_Subst.instantiateT: typ TVars.table -> typ -> typ\<close>
\<^item> \<^ML>\<open>Term_Subst.generalizeT: Names.set -> int -> typ -> typ\<close> \<^item> \<^ML>\<open>Term_Subst.generalizeT: Names.set -> int -> typ -> typ\<close>
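One way to build the \<open>typ TVars.table\<close> these functions expect (an editorial sketch, not part of the diff; it assumes \<^ML>\<open>TVars.make\<close> as available in Isabelle2023):

```sml
ML\<open>
(* an instantiation table mapping the schematic type ?'a to int *)
val S = TVars.make [((("'a", 0), @{sort type}), @{typ int})];
(* should yield the type int list *)
val ty = Term_Subst.instantiateT S @{typ "?'a list"};
\<close>
```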
@ -637,13 +639,16 @@ val ty' = Term_Subst.instantiateT S'' t_schematic;
(* Don't know how to build a typ TVars.table *) (* Don't know how to build a typ TVars.table *)
val t = (generalize_term @{term "[]"}); val t = (generalize_term @{term "[]"});
val t' = Term.map_types (Term_Subst.instantiateT S'') (t) val t' = Term_Subst.map_types_same (Term_Subst.instantiateT (TVars.make S')) (t)
(* or alternatively : *)
val t'' = Term.map_types (Term_Subst.instantiateT S'') (t)
\<close> \<close>
text\<open>A more abstract env for variable management in tactic proofs. A bit difficult to use text\<open>A more abstract env for variable management in tactic proofs. A bit difficult to use
outside the very closed-up tracks of conventional use...\<close> outside the very closed-up tracks of conventional use...\<close>
ML\<open> Variable.import_terms; ML\<open> Consts.the_const; (* T is a kind of signature ... *)
Variable.import_terms;
Vartab.update;\<close> Vartab.update;\<close>
subsection*[t232::technical]\<open> Type-Inference (= inferring consistent type information if possible) \<close> subsection*[t232::technical]\<open> Type-Inference (= inferring consistent type information if possible) \<close>
@ -721,7 +726,7 @@ proof - fix a :: nat
subsection*[t233::technical]\<open> Theories and the Signature API\<close> subsection*[t233::technical]\<open> Theories and the Signature API\<close>
text\<open> text\<open>
\<^enum> \<^ML>\<open>Sign.tsig_of : theory -> Type.tsig\<close> extracts the type-signature of a theory \<^enum> \<^ML>\<open>Sign.tsig_of : theory -> Type.tsig\<close> extracts the type-signature of a theory
\<^enum> \<^ML>\<open>Sign.syntax_of : theory -> Syntax.syntax\<close> extracts the constant-symbol signature \<^enum> \<^ML>\<open>Sign.syn_of : theory -> Syntax.syntax\<close> extracts the constant-symbol signature
\<^enum> \<^ML>\<open>Sign.of_sort : theory -> typ * sort -> bool\<close> decides that a type belongs to a sort. \<^enum> \<^ML>\<open>Sign.of_sort : theory -> typ * sort -> bool\<close> decides that a type belongs to a sort.
\<close> \<close>
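The signature API above can be exercised as follows (an editorial sketch, not part of the diff; it assumes the corrected name \<^ML>\<open>Sign.syn_of\<close>):

```sml
ML\<open>
(* querying the signature of the current theory *)
val tsig = Sign.tsig_of @{theory};
val syn  = Sign.syn_of @{theory};    (* note: syn_of, not syntax_of *)
val ok   = Sign.of_sort @{theory} (@{typ nat}, @{sort type});
\<close>
```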
@ -898,7 +903,7 @@ With the exception of the \<^ML>\<open>Specification.axiomatization\<close> cons
are all-together built as composition of conservative extensions. are all-together built as composition of conservative extensions.
The components are a bit scattered in the architecture. A relatively recent and The components are a bit scattered in the architecture. A relatively recent and
high-level component (more low-level components such as \<^ML>\<open>Global_Theory.add_def\<close> high-level component (more low-level components such as \<^ML>\<open>Global_Theory.add_defs\<close>
exist) for definitions and axiomatizations is here: exist) for definitions and axiomatizations is here:
\<close> \<close>
@ -1969,7 +1974,7 @@ text\<open> Note that the naming underlies the following convention.
This is encapsulated in the data structure @{ML_structure Syntax} --- This is encapsulated in the data structure @{ML_structure Syntax} ---
the table with const symbols, print and ast translations, ... The latter is accessible, e.g. the table with const symbols, print and ast translations, ... The latter is accessible, e.g.
from a Proof context via @{ML Proof_Context.syntax_of}. from a Proof context via @{ML Proof_Context.syn_of}.
\<close> \<close>
text\<open> Inner Syntax Parsing combinators for elementary Isabelle Lexems\<close> text\<open> Inner Syntax Parsing combinators for elementary Isabelle Lexems\<close>

View File

@ -1,9 +0,0 @@
template-beamerposter-UNSUPPORTED
template-beamer-UNSUPPORTED
template-lipics-v2021-UNSUPPORTED
template-lncs
template-scrartcl
template-scrreprt
template-scrreprt-modern
template-sn-article-UNSUPPORTED
template-svjour3-UNSUPPORTED

View File

@ -1,9 +0,0 @@
session "template-beamer-UNSUPPORTED" = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
"template-beamer-UNSUPPORTED"
document_files
"preamble.tex"

View File

@ -1 +0,0 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -1,72 +0,0 @@
(*<*)
theory
"template-beamer-UNSUPPORTED"
imports
"Isabelle_DOF-Ontologies.document_templates"
Isabelle_DOF.scholarly_paper
begin
list_templates
use_template "beamer-UNSUPPORTED"
list_ontologies
use_ontology "scholarly_paper"
(*>*)
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
(*
author*[bob, email = "\<open>bob@example.com\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
*)
text\<open>
\begin{frame}
\frametitle{Example Slide}
\centering\huge This is an example!
\end{frame}
\<close>
frame*[test_frame
, frametitle = \<open>\<open>\<open>Example Slide\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close> with items @{thm "HOL.refl"}\<close>\<close>
, framesubtitle = "''Subtitle''"]
\<open>This is an example!
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> and the term encoding the title of this frame is \<^term_>\<open>frametitle @{frame \<open>test_frame\<close>}\<close>\<close>
frame*[test_frame2
, frametitle = "''Example Slide''"
, framesubtitle = \<open>\<open>\<open>Subtitle\<^sub>t\<^sub>e\<^sub>s\<^sub>t:\<close> the value of \<^term>\<open>(3::int) + 3\<close> is @{value "(3::int) + 3"}\<close>\<close>]
\<open>Test frame env \<^term>\<open>refl\<close>\<close>
frame*[test_frame3
, options = "''allowframebreaks''"
, frametitle = "''Example Slide with frame break''"
, framesubtitle = \<open>\<open>\<open>Subtitle\<^sub>t\<^sub>e\<^sub>s\<^sub>t:\<close> the value of \<^term>\<open>(3::int) + 3\<close> is @{value "(3::int) + 3"}\<close>\<close>]
\<open>
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> and the term encoding the title of this frame is \<^term_>\<open>frametitle @{frame \<open>test_frame3\<close>}\<close>
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<close>
(*<*)
end
(*>*)

View File

@ -1,9 +0,0 @@
session "template-beamerposter-UNSUPPORTED" = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
"template-beamerposter-UNSUPPORTED"
document_files
"preamble.tex"

View File

@ -1 +0,0 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -1,21 +0,0 @@
theory
"template-beamerposter-UNSUPPORTED"
imports
"Isabelle_DOF-Ontologies.document_templates"
Isabelle_DOF.scholarly_paper
begin
list_templates
use_template "beamerposter-UNSUPPORTED"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -1,10 +0,0 @@
session "template-lipics-v2021-UNSUPPORTED" = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
"template-lipics-v2021-UNSUPPORTED"
document_files
"preamble.tex"
"lipics-v2021.cls"

View File

@ -1 +0,0 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -1,21 +0,0 @@
theory
"template-lipics-v2021-UNSUPPORTED"
imports
"Isabelle_DOF-Ontologies.document_templates"
Isabelle_DOF.scholarly_paper
begin
list_templates
use_template "lipics-v2021-UNSUPPORTED"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -1,9 +0,0 @@
session "template-lncs" = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
"template-lncs"
document_files
"preamble.tex"

View File

@ -1 +0,0 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -1,21 +0,0 @@
theory
"template-lncs"
imports
"Isabelle_DOF-Ontologies.document_templates"
Isabelle_DOF.scholarly_paper
begin
list_templates
use_template "lncs"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -1,9 +0,0 @@
session "template-scrartcl" = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
"template-scrartcl"
document_files
"preamble.tex"

View File

@ -1 +0,0 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -1,21 +0,0 @@
theory
"template-scrartcl"
imports
"Isabelle_DOF-Ontologies.document_templates"
Isabelle_DOF.scholarly_paper
begin
list_templates
use_template "scrartcl"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -1,9 +0,0 @@
session "template-scrreprt-modern" = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
"template-scrreprt-modern"
document_files
"preamble.tex"

View File

@ -1 +0,0 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -1,21 +0,0 @@
theory
"template-scrreprt-modern"
imports
"Isabelle_DOF-Ontologies.document_templates"
Isabelle_DOF.technical_report
begin
list_templates
use_template "scrreprt-modern"
list_ontologies
use_ontology "technical_report"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -1,9 +0,0 @@
session "template-scrreprt" = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
"template-scrreprt"
document_files
"preamble.tex"

View File

@ -1 +0,0 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -1,21 +0,0 @@
theory
"template-scrreprt"
imports
"Isabelle_DOF-Ontologies.document_templates"
Isabelle_DOF.technical_report
begin
list_templates
use_template "scrreprt"
list_ontologies
use_ontology "technical_report"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -1,10 +0,0 @@
session "template-sn-article-UNSUPPORTED" = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
"template-sn-article-UNSUPPORTED"
document_files
"preamble.tex"
"sn-jnl.cls"

View File

@ -1 +0,0 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -1,21 +0,0 @@
theory
"template-sn-article-UNSUPPORTED"
imports
"Isabelle_DOF-Ontologies.document_templates"
Isabelle_DOF.scholarly_paper
begin
list_templates
use_template "sn-article-UNSUPPORTED"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -1,11 +0,0 @@
session "template-svjour3-UNSUPPORTED" = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof]
(*theories [document = false]
A
B*)
theories
"template-svjour3-UNSUPPORTED"
document_files
"preamble.tex"
"svjour3.cls"
"svglov3.clo"

View File

@ -1 +0,0 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -1,101 +0,0 @@
% SVJour3 DOCUMENT CLASS OPTION SVGLOV3 -- for standardised journals
%
% This is an enhancement for the LaTeX
% SVJour3 document class for Springer journals
%
%%
%%
%% \CharacterTable
%% {Upper-case \A\B\C\D\E\F\G\H\I\J\K\L\M\N\O\P\Q\R\S\T\U\V\W\X\Y\Z
%% Lower-case \a\b\c\d\e\f\g\h\i\j\k\l\m\n\o\p\q\r\s\t\u\v\w\x\y\z
%% Digits \0\1\2\3\4\5\6\7\8\9
%% Exclamation \! Double quote \" Hash (number) \#
%% Dollar \$ Percent \% Ampersand \&
%% Acute accent \' Left paren \( Right paren \)
%% Asterisk \* Plus \+ Comma \,
%% Minus \- Point \. Solidus \/
%% Colon \: Semicolon \; Less than \<
%% Equals \= Greater than \> Question mark \?
%% Commercial at \@ Left bracket \[ Backslash \\
%% Right bracket \] Circumflex \^ Underscore \_
%% Grave accent \` Left brace \{ Vertical bar \|
%% Right brace \} Tilde \~}
\ProvidesFile{svglov3.clo}
[2006/02/03 v3.1
style option for standardised journals]
\typeout{SVJour Class option: svglov3.clo for standardised journals}
\def\validfor{svjour3}
\ExecuteOptions{final,10pt,runningheads}
% No size changing allowed, hence a "copy" of size10.clo is included
\renewcommand\normalsize{%
\if@twocolumn
\@setfontsize\normalsize\@xpt{12.5pt}%
\else
\if@smallext
\@setfontsize\normalsize\@xpt\@xiipt
\else
\@setfontsize\normalsize{9.5pt}{11.5pt}%
\fi
\fi
\abovedisplayskip=3 mm plus6pt minus 4pt
\belowdisplayskip=3 mm plus6pt minus 4pt
\abovedisplayshortskip=0.0 mm plus6pt
\belowdisplayshortskip=2 mm plus4pt minus 4pt
\let\@listi\@listI}
\normalsize
\newcommand\small{%
\if@twocolumn
\@setfontsize\small{8.5pt}\@xpt
\else
\if@smallext
\@setfontsize\small\@viiipt{9.5pt}%
\else
\@setfontsize\small\@viiipt{9.25pt}%
\fi
\fi
\abovedisplayskip 8.5\p@ \@plus3\p@ \@minus4\p@
\abovedisplayshortskip \z@ \@plus2\p@
\belowdisplayshortskip 4\p@ \@plus2\p@ \@minus2\p@
\def\@listi{\leftmargin\leftmargini
\parsep 0\p@ \@plus1\p@ \@minus\p@
\topsep 4\p@ \@plus2\p@ \@minus4\p@
\itemsep0\p@}%
\belowdisplayskip \abovedisplayskip
}
\let\footnotesize\small
\newcommand\scriptsize{\@setfontsize\scriptsize\@viipt\@viiipt}
\newcommand\tiny{\@setfontsize\tiny\@vpt\@vipt}
\if@twocolumn
\newcommand\large{\@setfontsize\large\@xiipt\@xivpt}
\newcommand\LARGE{\@setfontsize\LARGE{16pt}{18pt}}
\else
\newcommand\large{\@setfontsize\large\@xipt\@xiipt}
\newcommand\LARGE{\@setfontsize\LARGE{13pt}{15pt}}
\fi
\newcommand\Large{\@setfontsize\Large\@xivpt{16dd}}
\newcommand\huge{\@setfontsize\huge\@xxpt{25}}
\newcommand\Huge{\@setfontsize\Huge\@xxvpt{30}}
%
\def\runheadhook{\rlap{\smash{\lower6.5pt\hbox to\textwidth{\hrulefill}}}}
\if@twocolumn
\setlength{\textwidth}{17.4cm}
\setlength{\textheight}{234mm}
\AtEndOfClass{\setlength\columnsep{6mm}}
\else
\if@smallext
\setlength{\textwidth}{11.9cm}
\setlength{\textheight}{19.4cm}
\else
\setlength{\textwidth}{12.2cm}
\setlength{\textheight}{19.8cm}
\fi
\fi
%
\AtBeginDocument{%
\@ifundefined{@journalname}
{\typeout{Unknown journal: specify \string\journalname\string{%
<name of your journal>\string} in preambel^^J}}{}}
%
\endinput
%%
%% End of file `svglov3.clo'.

View File

@ -1,21 +0,0 @@
theory
"template-svjour3-UNSUPPORTED"
imports
"Isabelle_DOF-Ontologies.document_templates"
Isabelle_DOF.scholarly_paper
begin
list_templates
use_template "svjour3-UNSUPPORTED"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -1310,7 +1310,7 @@ ML
\<open> DOF_core.get_onto_class_name_global "requirement" @{theory}; \<open> DOF_core.get_onto_class_name_global "requirement" @{theory};
Syntax.parse_typ @{context} "requirement"; Syntax.parse_typ @{context} "requirement";
val Type(t,_) = Syntax.parse_typ @{context} "requirement" handle ERROR _ => dummyT; val Type(t,_) = Syntax.parse_typ @{context} "requirement" handle ERROR _ => dummyT;
Syntax.read_typ @{context} "hypothesis" handle ERROR _ => dummyT; Syntax.read_typ @{context} "hypothesis" handle _ => dummyT;
Proof_Context.init_global; \<close> Proof_Context.init_global; \<close>
end end

View File

@ -209,7 +209,8 @@ let val class_ids_so_far = ["Conceptual.A", "Conceptual.B", "Conceptual.C", "Co
"Conceptual.E", "Conceptual.F", "Conceptual.G", "Conceptual.M", "Conceptual.E", "Conceptual.F", "Conceptual.G", "Conceptual.M",
"Isa_COL.float", "Isa_COL.frame", "Isa_COL.figure", "Isa_COL.chapter", "Isa_COL.float", "Isa_COL.frame", "Isa_COL.figure", "Isa_COL.chapter",
"Isa_COL.listing", "Isa_COL.section", "Isa_COL.paragraph", "Isa_COL.listing", "Isa_COL.section", "Isa_COL.paragraph",
"Isa_COL.subsection", "Isa_COL.text_element", "Isa_COL.subsubsection"] "Isa_COL.subsection", "Isa_COL.subparagraph",
"Isa_COL.text_element", "Isa_COL.subsubsection"]
val docclass_tab = map fst (Name_Space.dest_table (DOF_core.get_onto_classes \<^context>)); val docclass_tab = map fst (Name_Space.dest_table (DOF_core.get_onto_classes \<^context>));
in @{assert} (class_ids_so_far = docclass_tab) end\<close> in @{assert} (class_ids_so_far = docclass_tab) end\<close>

View File

@ -0,0 +1,80 @@
%% Copyright (c) 2019-2022 University of Exeter
%% 2018-2022 University of Paris-Saclay
%% 2018-2019 The University of Sheffield
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause
%% Warning: Do Not Edit!
%% =====================
%% This is the root file for Isabelle/DOF using the eptcs class.
%% Note that eptcs cannot be distributed as part of Isabelle/DOF; you need
%% to download eptcs.cls from http://eptcs.web.cse.unsw.edu.au/style.shtml
%% and add it manually to the preamble.tex and the ROOT file.
%%
%% All customization and/or additional packages should be added to the file
%% preamble.tex.
\documentclass[submission,copyright,creativecommons]{eptcs}
\title{No Title Given}
\usepackage{DOF-core}
\bibliographystyle{eptcs}% the mandatory bibstyle
\usepackage[numbers, sort&compress, sectionbib]{natbib}
\newcommand{\subtitle}[1]{%
\PackageError{DOF-eptcs-UNSUPPORTED}
{The LaTeX class eptcs does not support subtitles.}
{}\stop%
}%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% provide an alternative definition of
% begin: scholarly_paper.author
\newcommand{\dofeptcsinstitute}[1]{\mbox{}\\\protect\scriptsize%
\protect\begin{tabular}[t]{@{\protect\footnotesize}c@{}}%
#1%
\protect\end{tabular}%
}
\makeatletter
\newisadof{text.scholarly_paper.author}%
[label=,type=%
,scholarly_paper.author.email=%
,scholarly_paper.author.affiliation=%
,scholarly_paper.author.orcid=%
,scholarly_paper.author.http_site=%
][1]{%
\protected@write\@auxout{}{%
\string\protect\string\addauthor{%
#1 %
\string\mbox{}\string\dofeptcsinstitute{\commandkey{scholarly_paper.author.affiliation}} %
\string\mbox{}\string\email{\commandkey{scholarly_paper.author.email}} %
}
}
}
\makeatother
% end: scholarly_paper.author
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{document}
\maketitle
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{%
\bibliography{root}
}{}
\end{document}
%%% Local Variables:
%%% mode: latex
%%% TeX-master: t
%%% End:

View File

@ -23,8 +23,6 @@
%% All customization and/or additional packages should be added to the file %% All customization and/or additional packages should be added to the file
%% preamble.tex. %% preamble.tex.
\PassOptionsToPackage{english,USenglish}{babel}
\documentclass[a4paper,UKenglish,cleveref, autoref,thm-restate]{lipics-v2021} \documentclass[a4paper,UKenglish,cleveref, autoref,thm-restate]{lipics-v2021}
\bibliographystyle{plainurl}% the mandatory bibstyle \bibliographystyle{plainurl}% the mandatory bibstyle
\title{No Title Given} \title{No Title Given}
@ -58,7 +56,7 @@
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\renewcommand{\DOFauthor}{} \renewcommand{\DOFauthor}{}
\renewcommand{\DOFinstitute}{} \renewcommand{\DOFinstitute}{}
%\expandafter\newcommand\csname 2authand\endcsname{} \expandafter\newcommand\csname 2authand\endcsname{}
\expandafter\newcommand\csname 3authand\endcsname{} \expandafter\newcommand\csname 3authand\endcsname{}
\expandafter\newcommand\csname 4authand\endcsname{} \expandafter\newcommand\csname 4authand\endcsname{}

View File

@ -26,7 +26,6 @@
\bibliographystyle{sn-basic} \bibliographystyle{sn-basic}
\let\proof\relax \let\proof\relax
\let\endproof\relax \let\endproof\relax
\newcommand{\inst}[1]{}%
\newcommand{\institute}[1]{} \newcommand{\institute}[1]{}
\usepackage{manyfoot} \usepackage{manyfoot}
\usepackage{DOF-core} \usepackage{DOF-core}

View File

@ -27,7 +27,7 @@
\usepackage{DOF-core} \usepackage{DOF-core}
\usepackage{mathptmx} \usepackage{mathptmx}
\bibliographystyle{abbrvnat} \bibliographystyle{abbrvnat}
\newcommand{\inst}[1]{}%
\usepackage[numbers, sort&compress, sectionbib]{natbib} \usepackage[numbers, sort&compress, sectionbib]{natbib}
\usepackage{hyperref} \usepackage{hyperref}
@ -40,7 +40,7 @@
,plainpages=false ,plainpages=false
} % more detailed digital TOC (aka bookmarks) } % more detailed digital TOC (aka bookmarks)
\sloppy \sloppy
%\allowdisplaybreaks[4] \allowdisplaybreaks[4]
\begin{document} \begin{document}
\selectlanguage{USenglish}% \selectlanguage{USenglish}%

View File

@ -17,6 +17,8 @@ imports
"Isabelle_DOF.Isa_DOF" "Isabelle_DOF.Isa_DOF"
begin begin
define_template "./document-templates/root-eptcs-UNSUPPORTED.tex"
"Unsupported template for the EPTCS class. Not for general use."
define_template "./document-templates/root-lipics-v2021-UNSUPPORTED.tex" define_template "./document-templates/root-lipics-v2021-UNSUPPORTED.tex"
"Unsupported template for LIPICS (v2021). Not for general use." "Unsupported template for LIPICS (v2021). Not for general use."
define_template "./document-templates/root-svjour3-UNSUPPORTED.tex" define_template "./document-templates/root-svjour3-UNSUPPORTED.tex"

View File

@ -241,40 +241,4 @@ declare[[invariants_checking_with_tactics = false]]
declare[[invariants_strict_checking = false]] declare[[invariants_strict_checking = false]]
text\<open>Invariants can have term anti-quotations\<close>
doc_class invA =
a :: int
text*[invA_inst::invA, a = 3]\<open>\<close>
doc_class invB = invA +
b :: int
invariant a_pos :: "a \<sigma> \<ge> 0"
text*[invB_inst::invB, a = 3]\<open>\<close>
doc_class invC =
c :: invB
invariant a_invB_pos :: "a (c \<sigma>) \<ge> a @{invA \<open>invA_inst\<close>}"
text*[invC_inst::invC, c = "@{invB \<open>invB_inst\<close>}"]\<open>\<close>
text\<open>Bug:
With the polymorphic class implementation, invariant type inference is too permissive:
\<close>
doc_class invA' =
a :: int
doc_class invB' = invA' +
b :: int
invariant a_pos :: "a \<sigma> \<ge> 0"
doc_class ('a, 'b) invC' =
c :: invB'
d :: "'a list"
e :: "'b list"
invariant a_pos :: "a \<sigma> \<ge> 0"
text\<open>The \<^const>\<open>a\<close> selector in the \<^const>\<open>a_pos_inv\<close> invariant of the class \<^doc_class>\<open>('a, 'b) invC'\<close>
should be rejected, as the class neither has nor inherits an \<^const>\<open>a\<close> attribute
\<close>
end end

View File

@ -37,14 +37,13 @@ fun gen_enriched_document_command2 name {body} cid_transform attr_transform mark
xstring_opt:(xstring * Position.T) option), xstring_opt:(xstring * Position.T) option),
toks_list:Input.source list) toks_list:Input.source list)
: theory -> theory = : theory -> theory =
let val toplvl = Toplevel.make_state o SOME let val ((binding,cid_pos), doc_attrs) = meta_args
val ((binding,cid_pos), doc_attrs) = meta_args
val oid = Binding.name_of binding val oid = Binding.name_of binding
val oid' = if meta_args = ODL_Meta_Args_Parser.empty_meta_args val oid' = if meta_args = ODL_Meta_Args_Parser.empty_meta_args
then "output" then "output"
else oid else oid
(* as side-effect, generates markup *) (* as side-effect, generates markup *)
fun check_n_tex_text thy toks = let val ctxt = Toplevel.presentation_context (toplvl thy) fun check_n_tex_text thy toks = let val ctxt = Toplevel.presentation_context (Toplevel.make_state (SOME thy))
val pos = Input.pos_of toks; val pos = Input.pos_of toks;
val _ = Context_Position.reports ctxt val _ = Context_Position.reports ctxt
[(pos, Markup.language_document (Input.is_delimited toks)), [(pos, Markup.language_document (Input.is_delimited toks)),

View File

@ -693,27 +693,4 @@ lemma*[e5::E] testtest : "xx + testtest_level = yy + testtest_level \<Longrighta
text\<open>Indeed this fails:\<close> text\<open>Indeed this fails:\<close>
(*lemma*[e6::E] testtest : "xx + the (level @{test2 \<open>testtest2''\<close>}) = yy + the (level @{test2 \<open>testtest2''\<close>}) \<Longrightarrow> xx = yy" by simp*) (*lemma*[e6::E] testtest : "xx + the (level @{test2 \<open>testtest2''\<close>}) = yy + the (level @{test2 \<open>testtest2''\<close>}) \<Longrightarrow> xx = yy" by simp*)
text\<open>Bug: Checking of \<^theory_text>\<open>text*\<close> type against \<^theory_text>\<open>declare_reference*\<close> is not done.
Should be compatible with the type unification mechanism. See the record package.\<close>
doc_class 'a AAA_test =
aaa::"'a list"
doc_class 'a BBB_test =
bbb::"'a list"
declare_reference*[aaa_test::"'a::one AAA_test"]
text\<open>@{AAA_test (unchecked) \<open>aaa_test\<close>}\<close>
text\<open>\<open>aaa_test\<close> should fail and trigger an error because its type was \<^typ>\<open>'a::one AAA_test\<close>
in the \<^theory_text>\<open>declare_reference*\<close> command:\<close>
(*text*[aaa_test::"'a::one BBB_test"]\<open>\<close>*)
text*[aaa_test::"int AAA_test"]\<open>\<close>
text\<open>\<open>aaa_test'\<close> should fail and trigger an error because its type \<^typ>\<open>string AAA_test\<close>
is not compatible with its type \<^typ>\<open>'a::one AAA_test\<close> declared in
the \<^theory_text>\<open>declare_reference*\<close> command:\<close>
text*[aaa_test'::"string AAA_test"]\<open>\<close>
end end


@@ -1,7 +1,7 @@
chapter AFP chapter AFP
session "Isabelle_DOF" (AFP) = "Functional-Automata" + session "Isabelle_DOF" (AFP) = "Functional-Automata" +
options [document_build = dof, timeout = 300] options [document = pdf, document_output = "output", document_build = dof, timeout = 300]
sessions sessions
"Regular-Sets" "Regular-Sets"
directories directories
@@ -22,10 +22,8 @@ session "Isabelle_DOF" (AFP) = "Functional-Automata" +
"thys/manual/M_01_Introduction" "thys/manual/M_01_Introduction"
"thys/manual/M_02_Background" "thys/manual/M_02_Background"
"thys/manual/M_03_GuidedTour" "thys/manual/M_03_GuidedTour"
"thys/manual/M_04_Document_Ontology" "thys/manual/M_04_RefMan"
"thys/manual/M_05_Proofs_Ontologies" "thys/manual/M_05_Implementation"
"thys/manual/M_06_RefMan"
"thys/manual/M_07_Implementation"
"thys/manual/Isabelle_DOF_Manual" "thys/manual/Isabelle_DOF_Manual"
document_files document_files
"root.bib" "root.bib"
@@ -53,7 +51,6 @@ session "Isabelle_DOF" (AFP) = "Functional-Automata" +
"figures/doc-mod-onto-docinst.pdf" "figures/doc-mod-onto-docinst.pdf"
"figures/doc-mod-DOF.pdf" "figures/doc-mod-DOF.pdf"
"figures/doc-mod-term-aq.pdf" "figures/doc-mod-term-aq.pdf"
"figures/ThisPaperWithPreviewer.png"
export_classpath export_classpath


@@ -2,8 +2,6 @@
\input{M_01_Introduction.tex} \input{M_01_Introduction.tex}
\input{M_02_Background.tex} \input{M_02_Background.tex}
\input{M_03_GuidedTour.tex} \input{M_03_GuidedTour.tex}
\input{M_04_Document_Ontology.tex} \input{M_04_RefMan.tex}
\input{M_05_Proofs_Ontologies.tex} \input{M_05_Implementation.tex}
\input{M_06_RefMan.tex}
\input{M_07_Implementation.tex}
\input{Isabelle_DOF_Manual.tex} \input{Isabelle_DOF_Manual.tex}

(13 figure files and one other binary file changed; reported image sizes unchanged before/after)

@@ -39,8 +39,8 @@
\pagestyle{headings} \pagestyle{headings}
\uppertitleback{ \uppertitleback{
Copyright \copyright{} 2019--2024 University of Exeter, UK\\ Copyright \copyright{} 2019--2023 University of Exeter, UK\\
\phantom{Copyright \copyright{}} 2018--2024 Universit\'e Paris-Saclay, France\\ \phantom{Copyright \copyright{}} 2018--2023 Universit\'e Paris-Saclay, France\\
\phantom{Copyright \copyright{}} 2018--2019 The University of Sheffield, UK\\ \phantom{Copyright \copyright{}} 2018--2019 The University of Sheffield, UK\\
\smallskip \smallskip


@@ -10,39 +10,6 @@
year = 2015 year = 2015
} }
@Book{ books/daglib/0032976,
added-at = {2014-03-12T00:00:00.000+0100},
author = {Euzenat, J{\'e}r{\^o}me and Shvaiko, Pavel},
biburl = {https://www.bibsonomy.org/bibtex/28d5372a81f181d9d5a761ca12209cf39/dblp},
interhash = {fc55a5b84d114e38db0a0303cc1bd7da},
intrahash = {8d5372a81f181d9d5a761ca12209cf39},
isbn = {978-3-642-38720-3},
keywords = {dblp},
pages = {I-XVII, 1--511},
publisher = {Springer},
timestamp = {2015-06-18T09:49:52.000+0200},
title = {Ontology Matching, Second Edition.},
year = 2013,
doi = {10.1007/978-3-642-38721-0}
}
@Misc{ atl,
title = {{ATL} -- A model transformation technology},
url = {https://www.eclipse.org/atl/},
author = {{Eclipse Foundation}},
}
@InProceedings{ BGPP95,
author = {Yamine A{\"{\i}}t Ameur and Frederic Besnard and Patrick Girard and Guy Pierra and Jean{-}Claude
Potier},
title = {Formal Specification and Metaprogramming in the {EXPRESS} Language},
booktitle = {The 7th International Conference on Software Engineering and Knowledge Engineering (SEKE)},
pages = {181--188},
publisher = {Knowledge Systems Institute},
year = 1995
}
@Misc{ ibm:doors:2019, @Misc{ ibm:doors:2019,
author = {IBM}, author = {IBM},
title = {{IBM} Engineering Requirements Management {DOORS} Family}, title = {{IBM} Engineering Requirements Management {DOORS} Family},


@@ -94,31 +94,31 @@
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% begin: chapter*-dispatcher % begin: chapter*-dispatcher
\NewEnviron{isamarkupchapter*}[1][]{\isaDof[env={IsaUNDERSCORECOLDOTchapter},#1]{\BODY}} %\NewEnviron{isamarkupchapter*}[1][]{\isaDof[env={IsaUNDERSCORECOLDOTchapter},#1]{\BODY}}
% end: chapter*-dispatcher % end: chapter*-dispatcher
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% begin: section*-dispatcher % begin: section*-dispatcher
\NewEnviron{isamarkupsection*}[1][]{\isaDof[env={IsaUNDERSCORECOLDOTsection},#1]{\BODY}} %\NewEnviron{isamarkupsection*}[1][]{\isaDof[env={IsaUNDERSCORECOLDOTsection},#1]{\BODY}}
% end: section*-dispatcher % end: section*-dispatcher
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% begin: subsection*-dispatcher % begin: subsection*-dispatcher
\NewEnviron{isamarkupsubsection*}[1][]{\isaDof[env={IsaUNDERSCORECOLDOTsubsection},#1]{\BODY}} %\NewEnviron{isamarkupsubsection*}[1][]{\isaDof[env={IsaUNDERSCORECOLDOTsubsection},#1]{\BODY}}
% end: subsection*-dispatcher % end: subsection*-dispatcher
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% begin: subsubsection*-dispatcher % begin: subsubsection*-dispatcher
\NewEnviron{isamarkupsubsubsection*}[1][]{\isaDof[env={IsaUNDERSCORECOLDOTsubsubsection},#1]{\BODY}} %\NewEnviron{isamarkupsubsubsection*}[1][]{\isaDof[env={IsaUNDERSCORECOLDOTsubsubsection},#1]{\BODY}}
% end: subsubsection*-dispatcher % end: subsubsection*-dispatcher
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% begin: paragraph*-dispatcher % begin: paragraph*-dispatcher
\NewEnviron{isamarkupparagraph*}[1][]{\isaDof[env={IsaUNDERSCORECOLDOTparagraph},#1]{\BODY}} %\NewEnviron{isamarkupparagraph*}[1][]{\isaDof[env={IsaUNDERSCORECOLDOTparagraph},#1]{\BODY}}
% end: paragraph*-dispatcher % end: paragraph*-dispatcher
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
@@ -135,21 +135,21 @@
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% begin: chapter/section default implementations % begin: chapter/section default implementations
\newisadof{IsaUNDERSCORECOLDOTchapter}[label=,type=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=][1]{% %\newisadof{IsaUNDERSCORECOLDOTchapter}[label=,type=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=][1]{%
\isamarkupfalse\isamarkupchapter{#1}\label{\commandkey{label}}\isamarkuptrue% % \isamarkupfalse\isamarkupchapter{#1}\label{\commandkey{label}}\isamarkuptrue%
} %}
\newisadof{IsaUNDERSCORECOLDOTsection}[label=,type=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=][1]{% %\newisadof{IsaUNDERSCORECOLDOTsection}[label=,type=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=][1]{%
\isamarkupfalse\isamarkupsection{#1}\label{\commandkey{label}}\isamarkuptrue% % \isamarkupfalse\isamarkupsection{#1}\label{\commandkey{label}}\isamarkuptrue%
} %}
\newisadof{IsaUNDERSCORECOLDOTsubsection}[label=,type=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=][1]{% %\newisadof{IsaUNDERSCORECOLDOTsubsection}[label=,type=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=][1]{%
\isamarkupfalse\isamarkupsubsection{#1}\label{\commandkey{label}}\isamarkuptrue% % \isamarkupfalse\isamarkupsubsection{#1}\label{\commandkey{label}}\isamarkuptrue%
} %}
\newisadof{IsaUNDERSCORECOLDOTsubsubsection}[label=,type=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=][1]{% %\newisadof{IsaUNDERSCORECOLDOTsubsubsection}[label=,type=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=][1]{%
\isamarkupfalse\isamarkupsubsubsection{#1}\label{\commandkey{label}}\isamarkuptrue% % \isamarkupfalse\isamarkupsubsubsection{#1}\label{\commandkey{label}}\isamarkuptrue%
} %}
\newisadof{IsaUNDERSCORECOLDOTparagraph}[label=,type=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=][1]{% %\newisadof{IsaUNDERSCORECOLDOTparagraph}[label=,type=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=][1]{%
\isamarkupfalse\isamarkupparagraph{#1}\label{\commandkey{label}}\isamarkuptrue% % \isamarkupfalse\isamarkupparagraph{#1}\label{\commandkey{label}}\isamarkuptrue%
} %}
% end: chapter/section default implementations % end: chapter/section default implementations
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%


@@ -31,9 +31,9 @@
\@ifclassloaded{llncs}% \@ifclassloaded{llncs}%
{}% {}%
{% {%
\RequirePackage{amsthm}
\@ifclassloaded{scrartcl}% \@ifclassloaded{scrartcl}%
{% {%
\RequirePackage{amsthm}
\newcommand{\institute}[1]{}% \newcommand{\institute}[1]{}%
\newcommand{\inst}[1]{}% \newcommand{\inst}[1]{}%
\newcommand{\orcidID}[1]{}% \newcommand{\orcidID}[1]{}%
@@ -42,7 +42,6 @@
{% {%
\@ifclassloaded{lipics-v2021}% \@ifclassloaded{lipics-v2021}%
{% {%
\RequirePackage{amsthm}
\newcommand{\institute}[1]{}% \newcommand{\institute}[1]{}%
\newcommand{\inst}[1]{}% \newcommand{\inst}[1]{}%
\newcommand{\orcidID}[1]{}% \newcommand{\orcidID}[1]{}%
@@ -50,14 +49,21 @@
}% }%
{% {%
{% {%
\@ifclassloaded{svjour3}% \@ifclassloaded{eptcs}%
{% {%
\newcommand{\inst}[1]{}% \newcommand{\inst}[1]{}%
\newcommand{\orcidID}[1]{}%
}% }%
{% {%
\PackageError{DOF-scholarly_paper} \@ifclassloaded{svjour3}%
{Scholarly Paper only supports LNCS or scrartcl as document class.} {%
{}\stop% \newcommand{\inst}[1]{}%
}%
{%
\PackageError{DOF-scholarly_paper}
{Scholarly Paper only supports LNCS or scrartcl as document class.}
{}\stop%
}%
}% }%
}% }%
}% }%
@@ -165,58 +171,58 @@
%\newtheorem{definition}{Definition} %\newtheorem{definition}{Definition}
%\newtheorem{theorem}{Theorem} %\newtheorem{theorem}{Theorem}
\newtheorem{defn}{Definition} \newtheorem{defn}{Definition}
\providecommand{\defnautorefname}{Definition} \newcommand{\defnautorefname}{Definition}
\NewEnviron{isamarkupDefinition*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupDefinition*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{axm}{Axiom} \newtheorem{axm}{Axiom}
\providecommand{\axmautorefname}{Axiom} \newcommand{\axmautorefname}{Axiom}
\NewEnviron{isamarkupAxiom*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupAxiom*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{theom}{Theorem} \newtheorem{theom}{Theorem}
\providecommand{\theomautorefname}{Theorem} \newcommand{\theomautorefname}{Theorem}
\NewEnviron{isamarkupTheorem*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupTheorem*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{lemm}{Lemma} \newtheorem{lemm}{Lemma}
\providecommand{\lemmautorefname}{Lemma} \newcommand{\lemmautorefname}{Lemma}
\NewEnviron{isamarkupLemma*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupLemma*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{corr}{Corollary} \newtheorem{corr}{Corollary}
\providecommand{\corrautorefname}{Corollary} \newcommand{\corrautorefname}{Corollary}
\NewEnviron{isamarkupCorollary*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupCorollary*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{prpo}{Proposition} \newtheorem{prpo}{Proposition}
\providecommand{\prpoautorefname}{Proposition} \newcommand{\prpoautorefname}{Proposition}
\NewEnviron{isamarkupProposition*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupProposition*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{rulE}{Rule} \newtheorem{rulE}{Rule}
\providecommand{\rulEautorefname}{Rule} \newcommand{\rulEautorefname}{Rule}
\NewEnviron{isamarkupRule*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupRule*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{assn}{Assertion} \newtheorem{assn}{Assertion}
\providecommand{\assnautorefname}{Assertion} \newcommand{\assnautorefname}{Assertion}
\NewEnviron{isamarkupAssertion*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupAssertion*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{hypt}{Hypothesis} \newtheorem{hypt}{Hypothesis}
\providecommand{\hyptautorefname}{Hypothesis} \newcommand{\hyptautorefname}{Hypothesis}
\NewEnviron{isamarkupHypothesis*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupHypothesis*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{assm}{Assumption} \newtheorem{assm}{Assumption}
\providecommand{\assmautorefname}{Assumption} \newcommand{\assmautorefname}{Assumption}
\NewEnviron{isamarkupAssumption*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupAssumption*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{prms}{Premise} \newtheorem{prms}{Premise}
\providecommand{\prmsautorefname}{Premise} \newcommand{\prmsautorefname}{Premise}
\NewEnviron{isamarkupPremise*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupPremise*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{cons}{Consequence} \newtheorem{cons}{Consequence}
\providecommand{\consautorefname}{Consequence} \newcommand{\consautorefname}{Consequence}
\NewEnviron{isamarkupConsequence*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupConsequence*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{concUNDERSCOREstmt}{Conclusion} \newtheorem{concUNDERSCOREstmt}{Conclusion}
\providecommand{\concUNDERSCOREstmtautorefname}{Conclusion} \newcommand{\concUNDERSCOREstmtautorefname}{Conclusion}
\NewEnviron{isamarkupConclusion*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupConclusion*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{prfUNDERSCOREstmt}{Proof} \newtheorem{prfUNDERSCOREstmt}{Proof}
\providecommand{\prfUNDERSCOREstmtautorefname}{Proof} \newcommand{\prfUNDERSCOREstmtautorefname}{Proof}
\NewEnviron{isamarkupProof*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupProof*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{explUNDERSCOREstmt}{Example} \newtheorem{explUNDERSCOREstmt}{Example}
\providecommand{\explUNDERSCOREstmtautorefname}{Example} \newcommand{\explUNDERSCOREstmtautorefname}{Example}
\NewEnviron{isamarkupExample*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupExample*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{rmrk}{Remark} \newtheorem{rmrk}{Remark}
\providecommand{\rmrkautorefname}{Remark} \newcommand{\rmrkautorefname}{Remark}
\NewEnviron{isamarkupRemark*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupRemark*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{notn}{Notation} \newtheorem{notn}{Notation}
\providecommand{\notnautorefname}{Notation} \newcommand{\notnautorefname}{Notation}
\NewEnviron{isamarkupNotation*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupNotation*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newtheorem{tmgy}{Terminology} \newtheorem{tmgy}{Terminology}
\providecommand{\tmgyautorefname}{Terminology} \newcommand{\tmgyautorefname}{Terminology}
\NewEnviron{isamarkupTerminology*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}} \NewEnviron{isamarkupTerminology*}[1][]{\isaDof[env={text},#1,type={scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}]{\BODY}}
\newisadof{textDOTscholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}% \newisadof{textDOTscholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent}%


@@ -590,15 +590,15 @@ text \<open>underlying idea: a monitor class automatically receives a
doc_class article = doc_class article =
style_id :: string <= "''LNCS''" style_id :: string <= "''LNCS''"
version :: "(int \<times> int \<times> int)" <= "(0,0,0)" version :: "(int \<times> int \<times> int)" <= "(0,0,0)"
accepts "(title ~~ accepts "(title ~~
\<lbrakk>subtitle\<rbrakk> ~~ \<lbrakk>subtitle\<rbrakk> ~~
\<lbrace>author\<rbrace>\<^sup>+ ~~ \<lbrace>author\<rbrace>\<^sup>+ ~~
abstract ~~ abstract ~~
\<lbrace>introduction\<rbrace>\<^sup>+ ~~ \<lbrace>introduction\<rbrace>\<^sup>+ ~~
\<lbrace>background\<rbrace>\<^sup>* ~~ \<lbrace>background\<rbrace>\<^sup>* ~~
\<lbrace>technical || example || float \<rbrace>\<^sup>+ ~~ \<lbrace>technical || example \<rbrace>\<^sup>+ ~~
\<lbrace>conclusion\<rbrace>\<^sup>+ ~~ \<lbrace>conclusion\<rbrace>\<^sup>+ ~~
bibliography ~~ bibliography ~~
\<lbrace>annex\<rbrace>\<^sup>* )" \<lbrace>annex\<rbrace>\<^sup>* )"
@@ -629,7 +629,7 @@ fun check ctxt cidS mon_id pos_opt =
fun get_level_raw oid = ISA_core.compute_attr_access ctxt "level" oid NONE @{here}; fun get_level_raw oid = ISA_core.compute_attr_access ctxt "level" oid NONE @{here};
fun get_level oid = dest_option (snd o HOLogic.dest_number) (get_level_raw (oid)); fun get_level oid = dest_option (snd o HOLogic.dest_number) (get_level_raw (oid));
fun check_level_hd a = case (get_level (snd a)) of fun check_level_hd a = case (get_level (snd a)) of
NONE => error("Invariant violation: leading section " ^ snd a ^ NONE => error("Invariant violation: leading section" ^ snd a ^
" must have lowest level") " must have lowest level")
| SOME X => X | SOME X => X
fun check_group_elem level_hd a = case (get_level (snd a)) of fun check_group_elem level_hd a = case (get_level (snd a)) of
@@ -662,7 +662,6 @@ let val cidS = ["scholarly_paper.introduction","scholarly_paper.technical",
in DOF_core.add_ml_invariant binding (DOF_core.make_ml_invariant (body, cid_long)) thy end) in DOF_core.add_ml_invariant binding (DOF_core.make_ml_invariant (body, cid_long)) thy end)
\<close> \<close>
term\<open>float\<close>
section\<open>Miscelleous\<close> section\<open>Miscelleous\<close>
subsection\<open>Common Abbreviations\<close> subsection\<open>Common Abbreviations\<close>


@@ -0,0 +1,84 @@
(*************************************************************************
* Copyright (C)
* 2019 The University of Exeter
* 2018-2019 The University of Paris-Saclay
* 2018 The University of Sheffield
*
* License:
* This program can be redistributed and/or modified under the terms
* of the 2-clause BSD-style license.
*
* SPDX-License-Identifier: BSD-2-Clause
*************************************************************************)
section\<open>An example ontology for a scholarly paper\<close>
theory technical_report
imports "Isabelle_DOF.scholarly_paper"
begin
define_ontology "DOF-technical_report.sty" "Writing technical reports."
(* for reports paper: invariant: level \<ge> -1 *)
section\<open>More Global Text Elements for Reports\<close>
doc_class table_of_contents =
bookmark_depth :: int <= 3
depth :: int <= 3
doc_class front_matter =
front_matter_style :: string (* TODO Achim :::: *)
doc_class index =
kind :: "doc_class"
level :: "int option"
section\<open>Code Statement Elements\<close>
doc_class "code" = technical +
checked :: bool <= "False"
caption :: "string" <= "''''"
typ code
text\<open>The \<^doc_class>\<open>code\<close> is a general stub for free-form and type-checked code-fragments such as:
\<^enum> SML code
\<^enum> bash code
\<^enum> isar code (although this might be an unwanted concurrence
to the Isabelle standard cartouche)
\<^enum> C code.
It is intended that later refinements of this "stub" as done in \<^verbatim>\<open>Isabelle_C\<close> which come with their
own content checking and presentation styles.
\<close>
doc_class "SML" = code +
checked :: bool <= "False"
doc_class "ISAR" = code +
checked :: bool <= "False"
doc_class "LATEX" = code +
checked :: bool <= "False"
print_doc_class_template "SML" (* just a sample *)
doc_class report =
style_id :: string <= "''LNCS''"
version :: "(int \<times> int \<times> int)" <= "(0,0,0)"
accepts "(title ~~
\<lbrakk>subtitle\<rbrakk> ~~
\<lbrace>author\<rbrace>\<^sup>+ ~~
\<lbrakk>front_matter\<rbrakk> ~~
abstract ~~
\<lbrakk>table_of_contents\<rbrakk> ~~
\<lbrace>introduction\<rbrace>\<^sup>+ ~~
\<lbrace>technical || figure \<rbrace>\<^sup>+ ~~
\<lbrace>conclusion\<rbrace>\<^sup>+ ~~
\<lbrace>index\<rbrace>\<^sup>* ~~
bibliography)"
end


@@ -39,16 +39,15 @@ import isabelle._
object DOF { object DOF {
/** parameters **/ /** parameters **/
val isabelle_version = "2024" val isabelle_version = "2023"
val isabelle_url = "https://isabelle.in.tum.de/website-Isabelle2024" val isabelle_url = "https://isabelle.in.tum.de/website-Isabelle2023"
val afp_version = "afp-2024" val afp_version = "afp-2023-09-13"
// Isabelle/DOF version: "Unreleased" for development, semantic version for releases // Isabelle/DOF version: "Unreleased" for development, semantic version for releases
val version = "Unreleased" val version = "Unreleased"
val session = "Isabelle_DOF" val session = "Isabelle_DOF"
val session_ontologies = "Isabelle_DOF-Ontologies"
val latest_version = "1.3.0" val latest_version = "1.3.0"
val latest_isabelle = "Isabelle2021-1" val latest_isabelle = "Isabelle2021-1"
@@ -56,7 +55,7 @@ object DOF {
val generic_doi = "10.5281/zenodo.3370482" val generic_doi = "10.5281/zenodo.3370482"
// Isabelle/DOF source repository // Isabelle/DOF source repository
val url = "https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF/" val url = "https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF/src/branch/Isabelle_dev"
// Isabelle/DOF release artifacts // Isabelle/DOF release artifacts
val artifact_dir = "releases/Isabelle_DOF/Isabelle_DOF" val artifact_dir = "releases/Isabelle_DOF/Isabelle_DOF"


@@ -24,9 +24,9 @@ text\<open> Building a fundamental infrastructure for common document elements s
theory Isa_COL theory Isa_COL
imports Isa_DOF imports Isa_DOF
keywords "title*" "subtitle*" keywords "title*" "subtitle*"
"chapter*" "section*" "paragraph*"
"subsection*" "subsubsection*"
"figure*" "listing*" "frame*" :: document_body "figure*" "listing*" "frame*" :: document_body
and "chapter*" "section*" "subsection*"
"subsubsection*" "paragraph*" "subparagraph*" :: document_heading
begin begin
@@ -58,7 +58,8 @@ doc_class "subsubsection" = text_element +
level :: "int option" <= "Some 3" level :: "int option" <= "Some 3"
doc_class "paragraph" = text_element + doc_class "paragraph" = text_element +
level :: "int option" <= "Some 4" level :: "int option" <= "Some 4"
doc_class "subparagraph" = text_element +
level :: "int option" <= "Some 5"
subsection\<open>Ontological Macros\<close> subsection\<open>Ontological Macros\<close>
@@ -128,19 +129,22 @@ fun enriched_document_cmd_exp ncid (S: (string * string) list) =
end; end;
end (* local *) end (* local *)
fun heading_command (name, pos) descr level = fun heading_command (name, pos) descr level =
Monitor_Command_Parser.document_command (name, pos) descr Monitor_Command_Parser.document_command (name, pos) descr
{markdown = false, body = true} (enriched_text_element_cmd level) [] I; {markdown = false, body = true} (enriched_text_element_cmd level) [] I;
val _ = heading_command \<^command_keyword>\<open>title*\<close> "section heading" NONE; fun heading_command' (name, pos) descr level =
val _ = heading_command \<^command_keyword>\<open>subtitle*\<close> "section heading" NONE; Monitor_Command_Parser.document_command (name, pos) descr
val _ = heading_command \<^command_keyword>\<open>chapter*\<close> "section heading" (SOME (SOME 0)); {markdown = false, body = false} (enriched_text_element_cmd level) [] I;
val _ = heading_command \<^command_keyword>\<open>section*\<close> "section heading" (SOME (SOME 1));
val _ = heading_command \<^command_keyword>\<open>subsection*\<close> "subsection heading" (SOME (SOME 2));
val _ = heading_command \<^command_keyword>\<open>subsubsection*\<close> "subsubsection heading" (SOME (SOME 3));
val _ = heading_command \<^command_keyword>\<open>paragraph*\<close> "paragraph" (SOME (SOME 4));
val _ = heading_command \<^command_keyword>\<open>title*\<close> "title heading" NONE;
val _ = heading_command \<^command_keyword>\<open>subtitle*\<close> "subtitle heading" NONE;
val _ = heading_command' \<^command_keyword>\<open>chapter*\<close> "chapter heading" (SOME (SOME 0));
val _ = heading_command' \<^command_keyword>\<open>section*\<close> "section heading" (SOME (SOME 1));
val _ = heading_command' \<^command_keyword>\<open>subsection*\<close> "subsection heading" (SOME (SOME 2));
val _ = heading_command' \<^command_keyword>\<open>subsubsection*\<close> "subsubsection heading" (SOME (SOME 3));
val _ = heading_command' \<^command_keyword>\<open>paragraph*\<close> "paragraph heading" (SOME (SOME 4));
val _ = heading_command' \<^command_keyword>\<open>subparagraph*\<close> "subparagraph heading" (SOME (SOME 5));
end end
end end
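
The `Isa_COL.thy` hunk above carries the commit's stated purpose: `chapter*`, `section*`, etc. become `document_heading` commands whose LaTeX headings are produced from SML rather than via the `\newisadof` dispatchers (commented out in the `.sty` hunks). The level-to-macro correspondence is fixed by the `doc_class` declarations (`chapter` = 0 through `subparagraph` = 5); a minimal Isabelle/ML sketch of that mapping follows — the function name and error handling are illustrative assumptions, not code from the repository:

```sml
(* Illustrative sketch only: map a COL heading level (per the doc_class
   declarations in the diff, chapter = 0 ... subparagraph = 5) to the
   corresponding LaTeX sectioning macro.  The function name is invented. *)
fun latex_macro_of_level 0 = "chapter"
  | latex_macro_of_level 1 = "section"
  | latex_macro_of_level 2 = "subsection"
  | latex_macro_of_level 3 = "subsubsection"
  | latex_macro_of_level 4 = "paragraph"
  | latex_macro_of_level 5 = "subparagraph"
  | latex_macro_of_level n = error ("no LaTeX heading for level " ^ string_of_int n)
```
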


@@ -36,12 +36,9 @@ theory Isa_DOF (* Isabelle Document Ontology Framework *)
and "open_monitor*" "close_monitor*" and "open_monitor*" "close_monitor*"
"declare_reference*" "update_instance*" "declare_reference*" "update_instance*"
"doc_class" "onto_class" (* a syntactic alternative *) "doc_class" "onto_class" (* a syntactic alternative *)
"onto_morphism" :: thy_decl "ML*"
and "to"
and "ML*"
"define_shortcut*" "define_macro*" :: thy_decl "define_shortcut*" "define_macro*" :: thy_decl
and "definition*" :: thy_defn and "definition*" :: thy_defn
and "theorem*" "lemma*" "corollary*" "proposition*" :: thy_goal_stmt and "theorem*" "lemma*" "corollary*" "proposition*" :: thy_goal_stmt
and "schematic_goal*" :: thy_goal_stmt and "schematic_goal*" :: thy_goal_stmt
@@ -77,7 +74,6 @@ val def_suffixN = "_" ^ defN
val defsN = defN ^ "s" val defsN = defN ^ "s"
val instances_of_suffixN = "_instances" val instances_of_suffixN = "_instances"
val invariant_suffixN = "_inv" val invariant_suffixN = "_inv"
val monitor_suffixN = "_monitor"
val instance_placeholderN = "\<sigma>" val instance_placeholderN = "\<sigma>"
val makeN = "make" val makeN = "make"
val schemeN = "_scheme" val schemeN = "_scheme"
@@ -860,13 +856,8 @@ val check_closing_ml_invs =
check_invs get_closing_ml_invariants closing_ml_invariant_class_of closing_ml_invariant_check_of check_invs get_closing_ml_invariants closing_ml_invariant_class_of closing_ml_invariant_check_of
(* Output name for LaTeX macros. (* Output name for LaTeX macros.
Also rewrite "." to allow macros with objects long names Also rewrite "." to allow macros with objects long names *)
and "\<^sub>" allowed in name bindings and then in instance names *) val output_name = Latex.output_name o translate_string (fn "." => "DOT" | s => s)
val output_name = Symbol.explode
#> map (fn s => if equal s Symbol.sub then "SUB" else s)
#> implode
#> translate_string (fn "." => "DOT" | s => s)
#> Latex.output_name
val ISA_prefix = "Isabelle_DOF_" val ISA_prefix = "Isabelle_DOF_"
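As a rough illustration of what the hunk above changes, the two name-mangling schemes can be sketched in Python. This is not the Isabelle API: the textual spelling of the subscript control symbol and the omission of the final `Latex.output_name` escaping step are assumptions of this sketch.

```python
# Sketch of the old and new LaTeX-macro name mangling shown in the diff above.
# Assumption: Isabelle's subscript control symbol is spelled "\<^sub>" here,
# and the trailing Latex.output_name escaping pass is left out.

SUB_MARKER = "\\<^sub>"

def output_name_old(name: str) -> str:
    """Old scheme: rewrite subscript markers to SUB, then dots to DOT."""
    return name.replace(SUB_MARKER, "SUB").replace(".", "DOT")

def output_name_new(name: str) -> str:
    """New scheme: only rewrite dots to DOT."""
    return name.replace(".", "DOT")

print(output_name_old("My.Class\\<^sub>1"))  # MyDOTClassSUB1
print(output_name_new("My.Class"))           # MyDOTClass
```

The difference matters for instance names containing subscripts: the old pipeline keeps them usable as LaTeX macro names, the new one no longer rewrites them.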
@@ -1807,16 +1798,20 @@ fun check_invariants thy binding =
     val cids_invariants = get_all_invariants name thy
     val inv_and_apply_list =
       let fun mk_inv_and_apply cid_invs value thy =
-            let val ctxt = Proof_Context.init_global thy
-                val (cid_long, invs) = cid_invs
+            let val (cid_long, invs) = cid_invs
+                val inv_def_typ = Term.type_of value
             in invs |> map
                  (fn (bind, _) =>
                     let
                       val inv_name = Binding.name_of bind
-                                     |> Long_Name.qualify cid_long
+                                     |> Long_Name.qualify (Long_Name.base_name cid_long)
                       val pos = Binding.pos_of bind
-                      val inv_def = inv_name |> Syntax.parse_term ctxt
-                    in ((inv_name, pos), Syntax.check_term ctxt (inv_def $ value)) end)
+                      val inv_def = inv_name
+                                    |> Syntax.read_term_global thy
+                    in case inv_def of
+                         Const (s, Type (st, [_ (*ty*), ty'])) =>
+                           ((inv_name, pos), Const (s, Type (st,[inv_def_typ, ty'])) $ value)
+                       | _ => ((inv_name, pos), inv_def $ value) end)
             end
       in cids_invariants |> map (fn cid_invs => mk_inv_and_apply cid_invs docitem_value thy)
          |> flat
@@ -1825,12 +1820,7 @@ fun check_invariants thy binding =
   let val ctxt = Proof_Context.init_global thy
       val trivial_true = \<^term>\<open>True\<close> |> HOLogic.mk_Trueprop |> Thm.cterm_of ctxt |> Thm.trivial
       val evaluated_term = value ctxt term
-        handle Match => error ("exception Match raised when checking "
-                               ^ inv_name ^ " invariant." ^ Position.here pos ^ "\n"
-                               ^ "You might want to check the definition of the invariant\n"
-                               ^ "against the value specified in the instance\n"
-                               ^ "or the default value declared in the instance class.")
-             | ERROR e =>
+        handle ERROR e =>
           if (String.isSubstring "Wellsortedness error" e)
              andalso (Config.get_global thy DOF_core.invariants_checking_with_tactics)
           then (warning("Invariants checking uses proof tactics");
@@ -1880,7 +1870,7 @@ fun check_invariants thy binding =
 fun create_and_check_docitem is_monitor {is_inline=is_inline} {define=define} binding cid_pos doc_attrs thy =
   let
     val oid = Binding.name_of binding
-    val (((name, args_cid), typ:typ), pos') = check_classref is_monitor cid_pos thy
+    val (((name, args_cid), typ), pos') = check_classref is_monitor cid_pos thy
     val cid_pos' = (name, pos')
     val cid_long = fst cid_pos'
     val default_cid = args_cid = DOF_core.default_cid
@@ -1905,13 +1895,13 @@ fun create_and_check_docitem is_monitor {is_inline=is_inline} {define=define} bi
     val defaults_init = create_default_object thy binding cid_long typ
     fun conv (na, _(*ty*), parsed_term) =(Binding.name_of na, Binding.pos_of na, "=", parsed_term);
     val S = map conv (DOF_core.get_attribute_defaults cid_long thy);
-    val (defaults, _) = calc_update_term {mk_elaboration=false}
+    val (defaults, _(*ty*), _) = calc_update_term {mk_elaboration=false}
                                          thy (name, typ) S defaults_init;
-    val (value_term', _) = calc_update_term {mk_elaboration=true}
+    val (value_term', _(*ty*), _) = calc_update_term {mk_elaboration=true}
                                             thy (name, typ) assns' defaults
   in if Config.get_global thy DOF_core.object_value_debug
      then let
-            val (input_term, _) = calc_update_term {mk_elaboration=false}
+            val (input_term, _(*ty*), _) = calc_update_term {mk_elaboration=false}
                                                    thy (name, typ) assns' defaults
           in (input_term, value_term') end
      else (\<^term>\<open>()\<close>, value_term') end
@@ -2262,7 +2252,7 @@ fun meta_args_2_latex thy sem_attrs transform_attr
 fun ltx_of_markup ctxt s = let
     val term = (Syntax.check_term ctxt o Syntax.parse_term ctxt) s
     val str_of_term = ltx_of_term ctxt true term
-                      (* handle _ => "Exception in ltx_of_term" *)
+                      handle _ => "Exception in ltx_of_term"
   in
     str_of_term
   end
@@ -2319,24 +2309,60 @@ fun gen_enriched_document_cmd' {inline} cid_transform attr_transform
 (* markup reports and document output *)
+val headings =
+  ["chapter",
+   "section",
+   "subsection",
+   "subsubsection",
+   "paragraph",
+   "subparagraph"]
+val headings_star = headings |> map (suffix "*")
 (* {markdown = true} sets the parsing process such that in the text-core
    markdown elements are accepted. *)
-fun document_output {markdown: bool, markup: Latex.text -> Latex.text} sem_attrs transform_attr meta_args text ctxt =
+(*fun document_output body {markdown: bool, markup: Latex.text -> Latex.text} sem_attrs transform_attr meta_args text ctxt =
   let
     val thy = Proof_Context.theory_of ctxt;
     val _ = Context_Position.reports ctxt (Document_Output.document_reports text);
     val output_meta = meta_args_2_latex thy sem_attrs transform_attr meta_args;
     val output_text = Document_Output.output_document ctxt {markdown = markdown} text;
-  in markup (output_meta @ output_text) end;
+  in if body
+     then markup (output_meta @ output_text)
+     else markup output_text
+  end*)
 fun document_output_reports name {markdown, body} sem_attrs transform_attr meta_args text ctxt =
   let
-    fun markup xml =
-      let val m = if body then Markup.latex_body else Markup.latex_heading
-      in [XML.Elem (m (Latex.output_name name), xml)] end;
-    val config = {markdown = markdown, markup = markup}
-  in document_output config sem_attrs transform_attr meta_args text ctxt
+    val thy = Proof_Context.theory_of ctxt;
+    val ((binding, cid_opt), _) = meta_args
+    fun headings_markup thy name binding m xml =
+      let val label = Binding.name_of binding
+                      |> (fn n => DOF_core.get_instance_name_global n thy)
+                      |> DOF_core.output_name
+                      |> Latex.string
+                      |> Latex.macro "label"
+      in [XML.Elem (m (Latex.output_name name), label @ xml)] end
+    val _ = Context_Position.reports ctxt (Document_Output.document_reports text);
+    val output_meta = meta_args_2_latex thy sem_attrs transform_attr meta_args;
+    val output_text = Document_Output.output_document ctxt {markdown = markdown} text;
+  in if body
+     then (case cid_opt of
+             NONE => let fun markup xml = [XML.Elem (Markup.latex_body name, xml)]
+                     in markup (output_meta @ output_text) end
+           | SOME x => let val cid = fst x
+                       in if headings |> map (Syntax.read_typ ctxt)
+                                      |> exists (Syntax.read_typ ctxt cid |> equal)
+                          then headings_markup thy cid binding Markup.latex_heading output_text
+                          else
+                            let fun markup xml = [XML.Elem (Markup.latex_body name, xml)]
+                            in markup (output_meta @ output_text) end
+                       end)
+     else if headings_star |> exists (equal name)
+          then headings_markup thy (name |> translate_string (fn "*" => "" | s => s)) binding
+                               Markup.latex_heading output_text
+          else headings_markup thy name binding Markup.latex_heading output_text
   end;
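The branching introduced in the new `document_output_reports` can be paraphrased as a small decision table: in body position, an ontology class that is itself a heading class gets heading markup, everything else body markup; outside body position, a starred heading command is unstarred. The Python sketch below only mirrors that decision structure; the marker names are stand-ins, and the SML code's type-level comparison of class ids against heading classes is simplified here to plain string membership.

```python
# Decision-structure sketch of the dispatch in document_output_reports above.
# "latex_body" / "latex_heading" are stand-in labels, not the Isabelle markup API.
HEADINGS = ["chapter", "section", "subsection",
            "subsubsection", "paragraph", "subparagraph"]
HEADINGS_STAR = [h + "*" for h in HEADINGS]

def choose_markup(name, body, cid):
    """Return (markup_kind, effective_name) for a document element.

    name: the command name, e.g. "section*" or "text*"
    body: True when the element is a text body rather than a heading command
    cid:  the ontology class id attached to the element, or None
    """
    if body:
        if cid is not None and cid in HEADINGS:
            # the attached class is itself a heading class
            return ("latex_heading", cid)
        return ("latex_body", name)
    if name in HEADINGS_STAR:
        # starred heading command: emit the unstarred heading macro
        return ("latex_heading", name.replace("*", ""))
    return ("latex_heading", name)

print(choose_markup("section*", False, None))  # ('latex_heading', 'section')
```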
@@ -2363,7 +2389,6 @@ fun onto_macro_cmd_command (name, pos) descr cmd output_cmd =
 ))))
 (* Core Command Definitions *)
-
 val _ =
@@ -2696,9 +2721,7 @@ fun gen_theorem schematic bundle_includes prep_att prep_stmt
       |> prep_statement (prep_att lthy) prep_stmt elems raw_concl;
     val atts = more_atts @ map (prep_att lthy) raw_atts;
-    val pos = Position.thread_data ();
-    val print_results =
-      Proof_Display.print_results {interactive = int, pos = pos, proof_state = false};
+    val print_cfg = {interactive = int, pos = Position.thread_data (), proof_state= false}
     fun after_qed' results goal_ctxt' =
       let
@@ -2710,13 +2733,13 @@ fun gen_theorem schematic bundle_includes prep_att prep_stmt
           Local_Theory.notes_kind kind
             (map2 (fn (b, _) => fn ths => (b, [(ths, [])])) stmt results') lthy;
         val lthy'' =
-          if Binding.is_empty_atts (name, atts)
-          then (print_results lthy' ((kind, ""), res); lthy')
+          if Binding.is_empty_atts (name, atts) then
+            (Proof_Display.print_results print_cfg lthy' ((kind, ""), res); lthy')
           else
             let
               val ([(res_name, _)], lthy'') =
                 Local_Theory.notes_kind kind [((name, atts), [(maps #2 res, [])])] lthy';
-              val _ = print_results lthy' ((kind, res_name), res);
+              val _ = Proof_Display.print_results print_cfg lthy' ((kind, res_name), res);
             in lthy'' end;
       in after_qed results' lthy'' end;
@@ -2917,7 +2940,6 @@ fun compute_trace_ML ctxt oid pos_opt pos' =
 val parse_oid = Scan.lift(Parse.position Args.name)
 val parse_cid = Scan.lift(Parse.position Args.name)
 val parse_oid' = Term_Style.parse -- parse_oid
-val parse_oid'' = Scan.lift(Parse.embedded_position)
 val parse_cid' = Term_Style.parse -- parse_cid
@@ -2962,6 +2984,13 @@ fun compute_cid_repr ctxt cid _ =
   let val _ = DOF_core.get_onto_class_name_global cid (Proof_Context.theory_of ctxt)
   in Const(cid,dummyT) end
+fun compute_name_repr ctxt oid pos =
+  let val instances = DOF_core.get_instances ctxt
+      val markup = DOF_core.get_instance_name_global oid (Proof_Context.theory_of ctxt)
+                   |> Name_Space.markup (Name_Space.space_of_table instances)
+      val _ = Context_Position.report ctxt pos markup;
+  in HOLogic.mk_string oid end
 local
 fun pretty_attr_access_style ctxt (style, ((attr,pos),(oid,pos'))) =
@@ -2971,15 +3000,8 @@ fun pretty_trace_style ctxt (style, (oid,pos)) =
   Document_Output.pretty_term ctxt (style (ISA_core.compute_attr_access (Context.Proof ctxt)
                                            traceN oid NONE pos));
-fun pretty_name_style ctxt (oid,pos) =
-  let
-    val instances = DOF_core.get_instances ctxt
-    val ns = Name_Space.space_of_table instances
-    val name = DOF_core.get_instance_name_global oid (Proof_Context.theory_of ctxt)
-    val ctxt' = Config.put Name_Space.names_unique true ctxt
-    val _ = name |> Name_Space.markup ns
-                 |> Context_Position.report ctxt pos
-  in Name_Space.pretty ctxt' ns name end
+fun pretty_name_style ctxt (style, (oid,pos)) =
+  Document_Output.pretty_term ctxt (style (compute_name_repr ctxt oid pos))
 fun pretty_cid_style ctxt (style, (cid,pos)) =
   Document_Output.pretty_term ctxt (style (compute_cid_repr ctxt cid pos));
@@ -2992,7 +3014,6 @@ fun context_position_parser parse_con (ctxt, toks) =
     val (res, (ctxt', toks')) = parse_con (ctxt, toks)
   in ((res,pos),(ctxt', toks')) end
-val parse_cid0 = parse_cid
 val parse_cid = (context_position_parser Args.typ_abbrev)
                 >> (fn (Type(ss,_),pos) => (pos,ss)
                    |( _,pos) => ISA_core.err "Undefined Class Id" pos);
@@ -3004,11 +3025,6 @@ fun pretty_cid_style ctxt (_,(pos,cid)) =
   (*reconversion to term in order to have access to term print options like: names_short etc...) *)
   Document_Output.pretty_term ctxt ((compute_cid_repr ctxt cid pos));
-fun get_class_2_ML ctxt (str,_) =
-  let val thy = Context.theory_of ctxt
-      val DOF_core.Onto_Class S = (DOF_core.get_onto_class_global' str thy)
-  in @{make_string} S end
 in
 val _ = Theory.setup
   (ML_Antiquotation.inline \<^binding>\<open>docitem\<close>
@@ -3019,13 +3035,11 @@ val _ = Theory.setup
    (fn (ctxt,toks) => (parse_oid >> trace_attr_2_ML ctxt) (ctxt, toks)) #>
    ML_Antiquotation.inline \<^binding>\<open>docitem_name\<close>
    (fn (ctxt,toks) => (parse_oid >> get_instance_name_2_ML ctxt) (ctxt, toks)) #>
-   ML_Antiquotation.inline \<^binding>\<open>doc_class\<close>
-   (fn (ctxt,toks) => (parse_cid0 >> get_class_2_ML ctxt) (ctxt, toks)) #>
    basic_entity \<^binding>\<open>trace_attribute\<close> parse_oid' pretty_trace_style #>
    basic_entity \<^binding>\<open>doc_class\<close> parse_cid' pretty_cid_style #>
    basic_entity \<^binding>\<open>onto_class\<close> parse_cid' pretty_cid_style #>
    basic_entity \<^binding>\<open>docitem_attribute\<close> parse_attribute_access' pretty_attr_access_style #>
-   basic_entity \<^binding>\<open>docitem_name\<close> parse_oid'' pretty_name_style
+   basic_entity \<^binding>\<open>docitem_name\<close> parse_oid' pretty_name_style
   )
 end
 end
@@ -3076,31 +3090,49 @@ fun def_cmd (decl, spec, prems, params) lthy =
 fun mk_meta_eq (t, u) = \<^Const>\<open>Pure.eq \<open>fastype_of t\<close> for t u\<close>;
-fun define_cond bind eq (ctxt:local_theory) =
-  let
+fun define_cond bind f_sty read_cond (ctxt:local_theory) =
+  let val eq = mk_meta_eq(Free(Binding.name_of bind, f_sty),read_cond)
      val args = (SOME(bind,NONE,NoSyn), (Binding.empty_atts,eq),[],[])
   in def_cmd args ctxt end
-fun define_inv (bind, inv) thy =
-  let val ctxt = Proof_Context.init_global thy
-      val inv_parsed_term = Syntax.parse_term ctxt inv |> DOF_core.elaborate_term' ctxt
-      val abs_term = Term.lambda (Free (instance_placeholderN, dummyT)) inv_parsed_term
-      val eq = Logic.mk_equals (Free(Binding.name_of bind, dummyT), abs_term)
-               |> Syntax.check_term (Proof_Context.init_global thy)
-  in thy |> Named_Target.theory_map (define_cond bind eq) end
-
-fun define_monitor_const doc_class_name bind thy =
-  let val bname = Long_Name.base_name doc_class_name
-      val rex = DOF_core.rex_of doc_class_name thy
-      val monitor_binding = bind |> (Binding.qualify false bname
-                                     #> Binding.suffix_name monitor_suffixN)
-  in
-    if can hd rex
-    then
-      let val eq = Logic.mk_equals (Free(Binding.name_of monitor_binding, doc_class_rexp_T), hd rex)
-      in thy |> Named_Target.theory_map (define_cond monitor_binding eq) end
-    else thy
-  end
+fun define_inv (params, cid_long) (bind, inv) thy =
+  let val inv_term = Syntax.read_term (Proof_Context.init_global thy) inv
+      fun update_attribute_type thy class_scheme_ty cid_long
+            (Const (s, Type (st,[ty, ty'])) $ t) =
+            let
+              val cid = Long_Name.qualifier s
+            in case Name_Space.lookup
+                      (DOF_core.get_onto_classes (Proof_Context.init_global thy)) cid of
+                 NONE => Const (s, Type(st,[ty, ty']))
+                         $ (update_attribute_type thy class_scheme_ty cid_long t)
+               | SOME _ => if DOF_core.is_subclass_global thy cid_long cid
+                           then let val Type(st', tys') = ty
+                                in if tys' = [\<^typ>\<open>unit\<close>]
+                                   then Const (s, Type(st,[ty, ty']))
+                                        $ (update_attribute_type thy class_scheme_ty cid_long t)
+                                   else Const(s, Type(st,[class_scheme_ty, ty']))
+                                        $ (update_attribute_type thy class_scheme_ty cid_long t)
+                                end
+                           else Const (s, Type(st,[ty, ty']))
+                                $ (update_attribute_type thy class_scheme_ty cid_long t)
+            end
+        | update_attribute_type thy class_scheme_ty cid_long (t $ t') =
+            (update_attribute_type thy class_scheme_ty cid_long t)
+            $ (update_attribute_type thy class_scheme_ty cid_long t')
+        | update_attribute_type thy class_scheme_ty cid_long (Abs(s, ty, t)) =
+            Abs(s, ty, update_attribute_type thy class_scheme_ty cid_long t)
+        | update_attribute_type _ class_scheme_ty _ (Free(s, ty)) = if s = instance_placeholderN
+                                                                    then Free (s, class_scheme_ty)
+                                                                    else Free (s, ty)
+        | update_attribute_type _ _ _ t = t
+      val zeta = (singleton (Name.variant_list (map #1 params)) "'z", \<^sort>\<open>type\<close>)
+      val typ = Type (cid_long ^ schemeN, map TFree (params @ [zeta]))
+      (* Update the type of each attribute update function to match the type of the
+         current class. *)
+      val inv_term' = update_attribute_type thy typ cid_long inv_term
+      val eq_inv_ty = typ --> HOLogic.boolT
+      val abs_term = Term.lambda (Free (instance_placeholderN, typ)) inv_term'
+  in thy |> Named_Target.theory_map (define_cond bind eq_inv_ty abs_term) end
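The core idea of the new `update_attribute_type` — recursively walk the invariant term and retype the instance placeholder (and, in the SML version, attribute-selector constants) at the current class's scheme type — can be sketched on a toy term AST. Everything here is a simplified stand-in for Isabelle's `Const`/`Free`/`Abs`/application constructors; types are plain strings rather than real `typ` values.

```python
from dataclasses import dataclass

# Toy term AST mirroring Isabelle's Const / Free / Abs / ($) constructors.
@dataclass
class Const:
    name: str
    typ: str

@dataclass
class Free:
    name: str
    typ: str

@dataclass
class Abs:
    var: str
    typ: str
    body: object

@dataclass
class App:
    f: object
    arg: object

PLACEHOLDER = "sigma"  # stand-in for instance_placeholderN ("\<sigma>")

def update_types(t, scheme_ty):
    """Recursively retype the placeholder Free at the class scheme type."""
    if isinstance(t, Free) and t.name == PLACEHOLDER:
        return Free(t.name, scheme_ty)
    if isinstance(t, App):
        return App(update_types(t.f, scheme_ty), update_types(t.arg, scheme_ty))
    if isinstance(t, Abs):
        return Abs(t.var, t.typ, update_types(t.body, scheme_ty))
    return t  # constants and other frees are left alone in this sketch

term = App(Const("level", "?"), Free(PLACEHOLDER, "dummyT"))
print(update_types(term, "my_class_scheme"))
```

The SML original additionally rewrites the argument type of each `Const` that names an attribute of a subclass, which this sketch omits for brevity.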
 fun add_doc_class_cmd overloaded (raw_params, binding)
     raw_parent raw_fieldsNdefaults reject_Atoms regexps invariants thy =
@@ -3171,10 +3203,9 @@ fun add_doc_class_cmd overloaded (raw_params, binding)
         | SOME _ => if (not o null) record_fields
                     then add record_fields invariants' {virtual=false}
                     else add [DOF_core.tag_attr] invariants' {virtual=true})
-    (* defines the ontology-checked text antiquotation to this document class *)
     |> (fn thy => OntoLinkParser.docitem_antiquotation binding (cid thy) thy)
-    |> (fn thy => fold define_inv (invariants') thy)
-    |> (fn thy => define_monitor_const (cid thy) binding thy)
+    (* defines the ontology-checked text antiquotation to this document class *)
+    |> (fn thy => fold(define_inv (params', (cid thy))) (invariants') thy)
     (* The function declare_ISA_class_accessor_and_check_instance uses a prefix
        because the class name is already bound to "doc_class Regular_Exp.rexp" constant
        by add_doc_class_cmd function *)
@@ -3214,76 +3245,6 @@
val clean_mixfix_sub = translate_string
(fn "\<^sub>_" => "_"
| "\<^sub>'" => "'"
| c => c);
val prefix_sub = prefix "\<^sub>"
val convertN = "convert"
fun add_onto_morphism classes_mappings eqs thy =
if (length classes_mappings = length eqs) then
let
val specs = map (fn x => (Binding.empty_atts, x)) eqs
val converts =
map (fn (oclasses, dclass) =>
let
val oclasses_string = map YXML.content_of oclasses
val dclass_string = YXML.content_of dclass
val const_sub_name = dclass_string
|> (oclasses_string |> fold_rev (fn x => fn y => x ^ "_" ^ y))
|> String.explode |> map (fn x => "\<^sub>" ^ (String.str x)) |> String.concat
val convert_typ = oclasses_string |> rev |> hd
|> (oclasses_string |> rev |> tl |> fold (fn x => fn y => x ^ " \<times> " ^ y))
val convert_typ' = convert_typ ^ " \<Rightarrow> " ^ dclass_string
val oclasses_sub_string = oclasses_string
|> map (clean_mixfix_sub
#> String.explode
#> map (prefix_sub o String.str)
#> String.concat)
val mixfix = oclasses_sub_string |> rev |> hd
|> (oclasses_sub_string |> rev |> tl |> fold (fn x => fn y => x ^ "\<^sub>\<times>" ^ y))
|> ISA_core.clean_mixfix
val mixfix' = convertN ^ mixfix ^ "\<^sub>\<Rightarrow>"
^ (dclass_string |> clean_mixfix_sub |> String.explode
|> map (prefix_sub o String.str) |> String.concat)
in SOME (Binding.name (convertN ^ const_sub_name), SOME convert_typ', Mixfix.mixfix mixfix') end)
classes_mappings
val args = map (fn (x, y) => (x, y, [], [])) (converts ~~ specs)
val lthy = Named_Target.theory_init thy
val updated_lthy = fold (fn (decl, spec, prems, params) => fn lthy =>
let
val (_, lthy') = Specification.definition_cmd decl params prems spec true lthy
in lthy' end) args lthy
in Local_Theory.exit_global updated_lthy end
(* alternative way to update the theory using the Theory.join_theory function *)
(*val lthys = map (fn (decl, spec, prems, params) =>
let
val (_, lthy') = Specification.definition_cmd decl params prems spec true lthy
in lthy' end) args
val thys = map (Local_Theory.exit_global) lthys
in Theory.join_theory thys end*)
else error("The number of morphisms declarations does not match the number of definitions")
fun add_onto_morphism' (classes_mappings, eqs) = add_onto_morphism classes_mappings eqs
val parse_onto_morphism = Parse.and_list
((Parse.$$$ "(" |-- Parse.enum1 "," Parse.typ --| Parse.$$$ ")" --| \<^keyword>\<open>to\<close>)
-- Parse.typ)
-- (\<^keyword>\<open>where\<close> |-- Parse.and_list Parse.prop)
(* The name of the definitions must follow this rule:
for the declaration "onto_morphism (AA, BB) to CC",
the name of the constant must be "convert\<^sub>A\<^sub>A\<^sub>\<times>\<^sub>B\<^sub>B\<^sub>\<Rightarrow>\<^sub>C\<^sub>C". *)
val _ =
Outer_Syntax.command \<^command_keyword>\<open>onto_morphism\<close> "define ontology morpism"
(parse_onto_morphism >> (Toplevel.theory o add_onto_morphism'));
end (* struct *)
\<close>


@@ -52,6 +52,12 @@ definition rep1 :: "'a rexp \<Rightarrow> 'a rexp" ("\<lbrace>(_)\<rbrace>\<^sup
 definition opt :: "'a rexp \<Rightarrow> 'a rexp" ("\<lbrakk>(_)\<rbrakk>")
           where "\<lbrakk>A\<rbrakk> \<equiv> A || One"
+find_consts name:"RegExpI*"
+ML\<open>
+val t = Sign.syn_of \<^theory>
+\<close>
+print_syntax
 value "Star (Conc(Alt (Atom(CHR ''a'')) (Atom(CHR ''b''))) (Atom(CHR ''c'')))"
 text\<open>or better equivalently:\<close>
 value "\<lbrace>(\<lfloor>CHR ''a''\<rfloor> || \<lfloor>CHR ''b''\<rfloor>) ~~ \<lfloor>CHR ''c''\<rfloor>\<rbrace>\<^sup>*"
@@ -236,72 +242,6 @@ end; (* local *)
 end (* struct *)
 \<close>
lemma regexp_sub : "a \<le> b \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b (\<lfloor>a\<rfloor>) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (\<lfloor>b\<rfloor>)"
using dual_order.trans by auto
lemma regexp_seq_mono:
"Lang(a) \<subseteq> Lang (a') \<Longrightarrow> Lang(b) \<subseteq> Lang (b') \<Longrightarrow> Lang(a ~~ b) \<subseteq> Lang(a' ~~ b')" by auto
lemma regexp_seq_mono':
"L\<^sub>s\<^sub>u\<^sub>b(a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (a') \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b(b) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (b') \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b(a ~~ b) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b(a' ~~ b')" by auto
lemma regexp_alt_mono :"Lang(a) \<subseteq> Lang (a') \<Longrightarrow> Lang(a || b) \<subseteq> Lang(a' || b)" by auto
lemma regexp_alt_mono' :"L\<^sub>s\<^sub>u\<^sub>b(a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (a') \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b(a || b) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b(a' || b)" by auto
lemma regexp_alt_commute : "Lang(a || b) = Lang(b || a)" by auto
lemma regexp_alt_commute' : "L\<^sub>s\<^sub>u\<^sub>b(a || b) = L\<^sub>s\<^sub>u\<^sub>b(b || a)" by auto
lemma regexp_unit_right : "Lang (a) = Lang (a ~~ One) " by simp
lemma regexp_unit_right' : "L\<^sub>s\<^sub>u\<^sub>b (a) = L\<^sub>s\<^sub>u\<^sub>b (a ~~ One) " by simp
lemma regexp_unit_left : "Lang (a) = Lang (One ~~ a) " by simp
lemma regexp_unit_left' : "L\<^sub>s\<^sub>u\<^sub>b (a) = L\<^sub>s\<^sub>u\<^sub>b (One ~~ a) " by simp
lemma opt_star_incl :"Lang (opt a) \<subseteq> Lang (Star a)" by (simp add: opt_def subset_iff)
lemma opt_star_incl':"L\<^sub>s\<^sub>u\<^sub>b (opt a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (Star a)" by (simp add: opt_def subset_iff)
lemma rep1_star_incl:"Lang (rep1 a) \<subseteq> Lang (Star a)"
unfolding rep1_def by(subst L_Star, subst L_Conc)(force)
lemma rep1_star_incl':"L\<^sub>s\<^sub>u\<^sub>b (rep1 a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (Star a)"
unfolding rep1_def by(subst L\<^sub>s\<^sub>u\<^sub>b_Star, subst L\<^sub>s\<^sub>u\<^sub>b_Conc)(force)
lemma cancel_rep1 : "Lang (a) \<subseteq> Lang (rep1 a)"
unfolding rep1_def by auto
lemma cancel_rep1' : "L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (rep1 a)"
unfolding rep1_def by auto
lemma seq_cancel_opt : "Lang (a) \<subseteq> Lang (c) \<Longrightarrow> Lang (a) \<subseteq> Lang (opt b ~~ c)"
by(subst regexp_unit_left, rule regexp_seq_mono)(simp_all add: opt_def)
lemma seq_cancel_opt' : "L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (c) \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (opt b ~~ c)"
by(subst regexp_unit_left', rule regexp_seq_mono')(simp_all add: opt_def)
lemma seq_cancel_Star : "Lang (a) \<subseteq> Lang (c) \<Longrightarrow> Lang (a) \<subseteq> Lang (Star b ~~ c)"
by auto
lemma seq_cancel_Star' : "L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (c) \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (Star b ~~ c)"
by auto
lemma mono_Star : "Lang (a) \<subseteq> Lang (b) \<Longrightarrow> Lang (Star a) \<subseteq> Lang (Star b)"
by(auto)(metis in_star_iff_concat order.trans)
lemma mono_Star' : "L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (b) \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b (Star a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (Star b)"
by(auto)(metis in_star_iff_concat order.trans)
lemma mono_rep1_star:"Lang (a) \<subseteq> Lang (b) \<Longrightarrow> Lang (rep1 a) \<subseteq> Lang (Star b)"
using mono_Star rep1_star_incl by blast
lemma mono_rep1_star':"L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (b) \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b (rep1 a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (Star b)"
using mono_Star' rep1_star_incl' by blast
no_notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
no_notation Plus (infixr "||" 55)
no_notation Times (infixr "~~" 60)


@@ -13,7 +13,7 @@
 (*<*)
 theory "Isabelle_DOF_Manual"
-  imports "M_07_Implementation"
+  imports "M_05_Implementation"
 begin
 close_monitor*[this]
 check_doc_global


@@ -124,7 +124,7 @@ abstract*[abs, keywordlist="[\<open>Ontology\<close>, \<open>Ontological Modelin
 A major application of \<^dof> is the integrated development of
 formal certification documents (\<^eg>, for Common Criteria or CENELEC
 50128) that require consistency across both formal and informal
 arguments.
 \<^isadof> is integrated into Isabelle's IDE, which
 allows for smooth ontology development as well as immediate


@@ -24,17 +24,17 @@ The linking of the \<^emph>\<open>formal\<close> to the \<^emph>\<open>informal\
 digitization of knowledge and its propagation. This challenge incites numerous research efforts
 summarized under the labels ``semantic web,'' ``data mining,'' or any form of advanced ``semantic''
 text processing. A key role in structuring this linking plays is \<^emph>\<open>document ontologies\<close> (also called
-\<^emph>\<open>vocabulary\<close> in the semantic web community~\<^cite>\<open>"w3c:ontologies:2015"\<close>), \<^ie>, a machine-readable
+\<^emph>\<open>vocabulary\<close> in the semantic web community~@{cite "w3c:ontologies:2015"}), \<^ie>, a machine-readable
 form of the structure of documents as well as the document discourse.
 Such ontologies can be used for the scientific discourse within scholarly articles, mathematical
 libraries, and in the engineering discourse of standardized software certification
-documents~\<^cite>\<open>"boulanger:cenelec-50128:2015" and "cc:cc-part3:2006"\<close>. All these
+documents~@{cite "boulanger:cenelec-50128:2015" and "cc:cc-part3:2006"}. All these
 documents contain formal content and have to follow a given structure. In practice, large groups of developers have to produce a substantial
 set of documents where consistency is notoriously difficult to maintain. In particular,
 certifications are centred around the \<^emph>\<open>traceability\<close> of requirements throughout the entire
 set of documents. While technical solutions for the traceability problem exist (most notably:
-DOORS~\<^cite>\<open>"ibm:doors:2019"\<close>), they are weak in the treatment of formal entities (such as formulas
+DOORS~@{cite "ibm:doors:2019"}), they are weak in the treatment of formal entities (such as formulas
 and their logical contexts).
 Further applications are the domain-specific discourse in juridical texts or medical reports.
@@ -53,7 +53,7 @@ the Document Ontology Framework (\<^dof>) and an implementation of \<^dof> calle
 provers. \<^isadof> is an instance of this novel framework, implemented as an extension of Isabelle/HOL,
 to \<^emph>\<open>model\<close> typed ontologies and to \<^emph>\<open>enforce\<close> them during document evolution. Based on Isabelle's
 infrastructures, ontologies may refer to types, terms, proven theorems, code, or established
-assertions. Based on a novel adaption of the Isabelle IDE (called PIDE, \<^cite>\<open>"wenzel:asynchronous:2014"\<close>),
+assertions. Based on a novel adaption of the Isabelle IDE (called PIDE, @{cite "wenzel:asynchronous:2014"}),
 a document is checked to be \<^emph>\<open>conform\<close> to a particular ontology---\<^isadof> is designed to give fast
 user-feedback \<^emph>\<open>during the capture of content\<close>. This is particularly valuable in the case of document
 evolution, where the \<^emph>\<open>coherence\<close> between the formal and the informal parts of the content can
@@ -63,9 +63,9 @@ To avoid any misunderstanding: \<^isadof> is \<^emph>\<open>not a theory in HOL
 track and trace links in texts. It is an \<^emph>\<open>environment to write structured text\<close> which
\<^emph>\<open>may contain\<close> Isabelle/HOL definitions and proofs like mathematical articles, tech-reports and \<^emph>\<open>may contain\<close> Isabelle/HOL definitions and proofs like mathematical articles, tech-reports and
scientific papers---as the present one, which is written in \<^isadof> itself. \<^isadof> is a plugin scientific papers---as the present one, which is written in \<^isadof> itself. \<^isadof> is a plugin
into the Isabelle/Isar framework in the style of~\<^cite>\<open>"wenzel.ea:building:2007"\<close>. However, into the Isabelle/Isar framework in the style of~@{cite "wenzel.ea:building:2007"}. However,
\<^isadof> will generate from ontologies a theory infrastructure consisting of types, terms, theorems \<^isadof> will generate from ontologies a theory infrastructure consisting of types, terms, theorems
and code that allows both interactive checking and formal reasoning over meta-data and code that allows both interactive checking as well as formal reasoning over meta-data
related to annotated documents.\<close> related to annotated documents.\<close>
subsubsection\<open>How to Read This Manual\<close> subsubsection\<open>How to Read This Manual\<close>
@ -113,26 +113,26 @@ CHANGELOG.md CITATION examples install LICENSE README.md ROOTS src\<close
subsubsection\<open>How to Cite \<^isadof>\<close> subsubsection\<open>How to Cite \<^isadof>\<close>
text\<open> text\<open>
If you use or extend \<^isadof> in your publications, please use If you use or extend \<^isadof> in your publications, please use
\<^item> for the \<^isadof> system~\<^cite>\<open>"brucker.ea:isabelle-ontologies:2018"\<close>: \<^item> for the \<^isadof> system~@{cite "brucker.ea:isabelle-ontologies:2018"}:
\begin{quote}\small \begin{quote}\small
A.~D. Brucker, I.~Ait-Sadoune, N. Méric, and B.~Wolff. Using Deep Ontologies in Formal A.~D. Brucker, I.~Ait-Sadoune, N. Méric, and B.~Wolff. Using Deep Ontologies in Formal
Software Engineering. In \<^emph>\<open>International Conference on Rigorous State-Based Methods (ABZ 2023)\<close>, Software Engineering. In \<^emph>\<open>International Conference on Rigorous State-Based Methods (ABZ 2023)\<close>,
To appear in Lecture Notes in Computer Science. Springer-Verlag, To appear in Lecture Notes in Computer Science. Springer-Verlag,
Heidelberg, 2023. \href{10.1007/978-3-031-33163-3_2} {10.1007/978-3-031-33163-3\_2}. Heidelberg, 2023. \href{https://doi.org/???} {10.1007/????????}.
\end{quote} \end{quote}
A \<^BibTeX>-entry is available at: A \<^BibTeX>-entry is available at:
\<^url>\<open>https://www.lri.fr/~wolff/bibtex/wolff.html\<close>. \<^url>\<open>https://www.lri.fr/~wolff/bibtex/wolff.html\<close>.
\<^item> an older description of the system~\<^cite>\<open>"brucker.ea:isabelle-ontologies:2018"\<close>: \<^item> an older description of the system~@{cite "brucker.ea:isabelle-ontologies:2018"}:
\begin{quote}\small \begin{quote}\small
A.~D. Brucker, I.~Ait-Sadoune, P.~Crisafulli, and B.~Wolff. Using the {Isabelle} ontology A.~D. Brucker, I.~Ait-Sadoune, P.~Crisafulli, and B.~Wolff. Using the {Isabelle} ontology
framework: Linking the formal with the informal. In \<^emph>\<open>Conference on Intelligent Computer framework: Linking the formal with the informal. In \<^emph>\<open>Conference on Intelligent Computer
Mathematics (CICM)\<close>, number 11006 in Lecture Notes in Computer Science. Springer-Verlag, Mathematics (CICM)\<close>, number 11006 in Lecture Notes in Computer Science. Springer-Verlag,
Heidelberg, 2018. \href{https://doi.org/10.1007/978-3-319-96812-4_3} Heidelberg, 2018. \href{https://doi.org/10.1007/978-3-319-96812-4\_3}
{10.1007/978-3-319-96812-4\_3}. {10.1007/978-3-319-96812-4\_3}.
\end{quote} \end{quote}
A \<^BibTeX>-entry is available at: A \<^BibTeX>-entry is available at:
\<^url>\<open>https://www.brucker.ch/bibliography/abstract/brucker.ea-isabelle-ontologies-2018\<close>. \<^url>\<open>https://www.brucker.ch/bibliography/abstract/brucker.ea-isabelle-ontologies-2018\<close>.
\<^item> for the implementation of \<^isadof>~\<^cite>\<open>"brucker.ea:isabelledof:2019"\<close>: \<^item> for the implementation of \<^isadof>~@{cite "brucker.ea:isabelledof:2019"}:
\begin{quote}\small \begin{quote}\small
A.~D. Brucker and B.~Wolff. \<^isadof>: Design and implementation. In P.C.~{\"O}lveczky and A.~D. Brucker and B.~Wolff. \<^isadof>: Design and implementation. In P.C.~{\"O}lveczky and
G.~Sala{\"u}n, editors, \<^emph>\<open>Software Engineering and Formal Methods (SEFM)\<close>, number 11724 in G.~Sala{\"u}n, editors, \<^emph>\<open>Software Engineering and Formal Methods (SEFM)\<close>, number 11724 in

View File

@@ -27,17 +27,17 @@ figure*[architecture::figure,relative_width="95",file_src="''figures/isabelle-ar
 text*[bg::introduction]\<open>
 While Isabelle is widely perceived as an interactive theorem
-prover for HOL (Higher-order Logic)~\<^cite>\<open>"nipkow.ea:isabelle:2002"\<close>, we would like to emphasize
+prover for HOL (Higher-order Logic)~@{cite "nipkow.ea:isabelle:2002"}, we would like to emphasize
 the view that Isabelle is far more than that: it is the \<^emph>\<open>Eclipse of Formal Methods Tools\<close>. This
 refers to the ``\<^emph>\<open>generic system framework of Isabelle/Isar underlying recent versions of Isabelle.
 Among other things, Isabelle provides an infrastructure for Isabelle plug-ins, comprising extensible
 state components and extensible syntax that can be bound to SML programs. Thus, the Isabelle
 architecture may be understood as an extension and refinement of the traditional `LCF approach',
-with explicit infrastructure for building derivative systems.\<close>''~\<^cite>\<open>"wenzel.ea:building:2007"\<close>
+with explicit infrastructure for building derivative systems.\<close>''~@{cite "wenzel.ea:building:2007"}
 The current system framework offers moreover the following features:
 \<^item> a build management grouping components into to pre-compiled sessions,
-\<^item> a prover IDE (PIDE) framework~\<^cite>\<open>"wenzel:asynchronous:2014"\<close> with various front-ends,
+\<^item> a prover IDE (PIDE) framework~@{cite "wenzel:asynchronous:2014"} with various front-ends,
 \<^item> documentation-generation,
 \<^item> code generators for various target languages,
 \<^item> an extensible front-end language Isabelle/Isar, and,
@@ -118,9 +118,9 @@ text\<open> A simple text-element \<^index>\<open>text-element\<close> may look
 @{boxed_theory_text [display]\<open>
 text\<open> This is a simple text.\<close>\<close>}
-\ldots so it is a command \<^theory_text>\<open>text\<close> followed by an argument (here in \<open>\<open> ... \<close>\<close> parenthesis) which
-contains characters. While \<^theory_text>\<open>text\<close>-elements play a major role in this manual---document
-generation is the main use-case of \<^dof> in its current stage---it is important to note that there
+... so it is a command \<^theory_text>\<open>text\<close> followed by an argument (here in \<open>\<open> ... \<close>\<close> parenthesis) which
+contains characters. While \<^theory_text>\<open>text\<close>- elements play a major role in this manual --- document
+generation is the main use-case of \<^dof> in its current stage --- it is important to note that there
 are actually three families of ``ontology aware'' document elements with analogous
 syntax to standard ones. The difference is a bracket with meta-data of the form:
 @{theory_text [display,indent=5, margin=70]
@@ -130,7 +130,7 @@ ML*[label::classid, attr\<^sub>1=E\<^sub>1, ... attr\<^sub>n=E\<^sub>n]\<open> s
 value*[label::classid, attr\<^sub>1=E\<^sub>1, ... attr\<^sub>n=E\<^sub>n]\<open> some annotated \<lambda>-term \<close>
 \<close>}
-Other instances of document elements belonging to these families are, for example, the free-form
+Other instances of document elements belonging to these families are, for example, the freeform
 \<^theory_text>\<open>Definition*\<close> and \<^theory_text>\<open>Lemma*\<close> as well as their formal counterparts \<^theory_text>\<open>definition*\<close> and \<^theory_text>\<open>lemma*\<close>,
 which allow in addition to their standard Isabelle functionality the creation and management of
 ontology-generated meta-data associated to them (cf. -@{text_section (unchecked) "writing_doc"}).
@@ -151,7 +151,7 @@ to switch from a text-context to a term-context, for example. Therefore, antiquo
 literature, this concept has been referred to \<open>Cascade-Syntax\<close> and was used in the
 Centaur-system and is existing in some limited form in some Emacs-implementations these days. \<close>
 Isabelle comes with a number of built-in antiquotations for text- and code-contexts;
-a detailed overview can be found in \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>. \<^dof> reuses this general
+a detailed overview can be found in @{cite "wenzel:isabelle-isar:2020"}. \<^dof> reuses this general
 infrastructure but \<^emph>\<open>generates\<close> its own families of antiquotations from ontologies.\<close>
 text\<open> An example for a text-element \<^index>\<open>text-element\<close> using built-in antoquotations
@@ -203,28 +203,28 @@ text\<open>
 (protected by a kernel). This includes, \<^eg>, ProofPower, HOL4, HOL-light, Isabelle, or Coq
 and its derivatives. \<^dof> is, however, designed for fast interaction in an IDE. If a user wants
 to benefit from this experience, only Isabelle and Coq have the necessary infrastructure of
-asynchronous proof-processing and support by a PIDE~\<^cite>\<open>"wenzel:asynchronous:2014" and
-"wenzel:system:2014" and "barras.ea:pervasive:2013" and "faithfull.ea:coqoon:2018"\<close> which
+asynchronous proof-processing and support by a PIDE~@{cite "wenzel:asynchronous:2014" and
+"wenzel:system:2014" and "barras.ea:pervasive:2013" and "faithfull.ea:coqoon:2018"} which
 in many features over-accomplishes the required features of \<^dof>.
 \<close>
 figure*["fig_dof_ide",relative_width="95",file_src="''figures/cicm2018-combined.png''"]\<open>
 The \<^isadof> IDE (left) and the corresponding PDF (right), showing the first page
-of~\<^cite>\<open>"brucker.ea:isabelle-ontologies:2018"\<close>.\<close>
+of~@{cite "brucker.ea:isabelle-ontologies:2018"}.\<close>
 text\<open>
 We call the present implementation of \<^dof> on the Isabelle platform \<^isadof> .
-@{figure "fig_dof_ide"} shows a screenshot of an introductory paper on
-\<^isadof>~\<^cite>\<open>"brucker.ea:isabelle-ontologies:2018"\<close>: the \<^isadof> PIDE can be seen on the left,
+@{figure "fig_dof_ide"} shows a screen-shot of an introductory paper on
+\<^isadof>~@{cite "brucker.ea:isabelle-ontologies:2018"}: the \<^isadof> PIDE can be seen on the left,
 while the generated presentation in PDF is shown on the right.
 Isabelle provides, beyond the features required for \<^dof>, a lot of additional benefits.
 Besides UTF8-support for characters used in text-elements, Isabelle offers built-in already a
 mechanism for user-programmable antiquotations \<^index>\<open>antiquotations\<close> which we use to implement
 semantic macros \<^index>\<open>semantic macros\<close> in \<^isadof> (We will actually use these two terms
 as synonym in the context of \<^isadof>). Moreover, \<^isadof> allows for the asynchronous
-evaluation and checking of the document content~\<^cite>\<open>"wenzel:asynchronous:2014" and
-"wenzel:system:2014" and "barras.ea:pervasive:2013"\<close> and is dynamically extensible. Its PIDE
+evaluation and checking of the document content~@{cite "wenzel:asynchronous:2014" and
+"wenzel:system:2014" and "barras.ea:pervasive:2013"} and is dynamically extensible. Its PIDE
 provides a \<^emph>\<open>continuous build, continuous check\<close> functionality, syntax highlighting, and
 auto-completion. It also provides infrastructure for displaying meta-information (\<^eg>, binding
 and type annotation) as pop-ups, while hovering over sub-expressions. A fine-grained dependency
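[Editorial aside, not part of the diff.] Both sides of the hunks above agree on the shape of the meta-data bracket \<open>[label::classid, attr\<^sub>1=E\<^sub>1, ...]\<close>; a minimal, hypothetical instance may make it concrete. The class \<open>technical\<close> and its attribute \<open>main_author\<close> are assumed to come from the \<open>scholarly_paper\<close> ontology, as used elsewhere in the files being diffed:

```isabelle
(* Hypothetical sketch: an ontology-aware text element.  The bracket after
   the star attaches the label ex_note, declares the instance to be of
   class technical, and sets the attribute main_author. *)
text*[ex_note::technical, main_author = "Some bu"]\<open>
  A text element carrying ontological meta-data.
\<close>
```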

View File

@@ -74,15 +74,15 @@ is currently consisting out of three AFP entries:
 contains the \<^isadof> system itself, including the \<^isadof> manual.
 \<^item> \<^verbatim>\<open>Isabelle_DOF-Example-I\<close>: This entry contains an example of
 an academic paper written using the \<^isadof> system oriented towards an
-introductory paper. The text is based on~\<^cite>\<open>"brucker.ea:isabelle-ontologies:2018"\<close>;
+introductory paper. The text is based on~@{cite "brucker.ea:isabelle-ontologies:2018"};
 in the document, we deliberately refrain from integrating references to formal content in order
 to demonstrate that \<^isadof> can be used for writing documents with very little direct use of
 \<^LaTeX>.
 \<^item> \<^verbatim>\<open>Isabelle_DOF-Example-II\<close>: This entry contains another example of
-a mathematics-oriented academic paper. It is based on~\<^cite>\<open>"taha.ea:philosophers:2020"\<close>.
+a mathematics-oriented academic paper. It is based on~@{cite "taha.ea:philosophers:2020"}.
 It represents a typical mathematical text, heavy in definitions with complex mathematical notation
-and a lot of non-trivial cross-referencing between statements, definitions, and proofs which
-are ontologically tracked. However, with respect to the possible linking between the underlying formal theory
+and a lot of non-trivial cross-referencing between statements, definitions and proofs which
+are ontologically tracked. However, wrt. the possible linking between the underlying formal theory
 and this mathematical presentation, it follows a pragmatic path without any ``deep'' linking to
 types, terms and theorems, and therefore does deliberately not exploit \<^isadof> 's full potential.\<close>
@@ -177,7 +177,7 @@ There are actually different name categories that form a proper name space, \<^e
 constant symbols and type symbols are distinguished.
 Additionally, \<^isadof> objects also come with a proper name space: classes (and monitors), instances,
 low-level class invariants (SML-defined invariants) all follow the lexical conventions of
-Isabelle. For instance, a class can be referenced outside its theory using
+Isabelle. For instance, a class can be referenced outside of its theory using
 its short-name or its long-name if another class with the same name is defined
 in the current theory.
 Isabelle identifies names already with the shortest suffix that is unique in the global
@@ -188,7 +188,7 @@ be printed with different prefixes according to their uniqueness.
 subsection*[cartouches::example]\<open>Caveat: Lexical Conventions of Cartouches, Strings, Names, ... \<close>
 text\<open>WARNING: The embedding of strings, terms, names \<^etc>, as parts of commands, anti-quotations,
 terms, \<^etc>, is unfortunately not always so consistent as one might expect, when it comes
 to variants that should be lexically equivalent in principle. This can be a nuisance for
 users, but is again a consequence that we build on existing technology that has been developed
 over decades.
@@ -289,7 +289,7 @@ contained in the theory \<^theory>\<open>Isabelle_DOF.scholarly_paper\<close>. \
 subsection\<open>Writing Academic Publications: A Freeform Mathematics Text \<close>
 text*[csp_paper_synthesis::technical, main_author = "Some bu"]\<open>We present a typical mathematical
 paper focusing on its form, not referring in any sense to its content which is out of scope here.
-As mentioned before, we chose the paper~\<^cite>\<open>"taha.ea:philosophers:2020"\<close> for this purpose,
+As mentioned before, we chose the paper~@{cite "taha.ea:philosophers:2020"} for this purpose,
 which is written in the so-called free-form style: Formulas are superficially parsed and
 type-set, but no deeper type-checking and checking with the underlying logical context
 is undertaken. \<close>
@@ -629,7 +629,7 @@ text\<open> This is *\<open>emphasized\<close> and this is a
 citation @{cite "brucker.ea:isabelle-ontologies:2018"}.\<close>
 \<close>}
 The list of standard Isabelle document antiquotations, as well as their options and styles,
-can be found in the Isabelle reference manual \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>,
+can be found in the Isabelle reference manual @{cite "wenzel:isabelle-isar:2020"},
 section 4.2.
 In practice, \<^isadof> documents with ambitious layout will contain a certain number of
@@ -638,7 +638,7 @@ not possible. As far as possible, raw \<^LaTeX> should be restricted to the defi
 of ontologies and macros (see @{docitem (unchecked) \<open>isadof_ontologies\<close>}). If raw
 \<^LaTeX> commands can not be avoided, it is recommended to use the Isabelle document comment
 \<^latex>\<open>\verb+\+\verb+<^latex>+\<close>\<open>\<open>argument\<close>\<close> to isolate these parts
-(cf. \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>).
+(cf. @{cite "wenzel:isabelle-isar:2020"}).
 Restricting the use of \<^LaTeX> has two advantages: first, \<^LaTeX> commands can circumvent the
 consistency checks of \<^isadof> and, hence, only if no \<^LaTeX> commands are used, \<^isadof> can

View File

@@ -13,9 +13,9 @@
 (*<*)
 theory
-"M_06_RefMan"
+"M_04_RefMan"
 imports
-"M_05_Proofs_Ontologies"
+"M_03_GuidedTour"
 begin
 declare_reference*[infrastructure::technical]
@@ -30,7 +30,7 @@ text\<open>
 representation, and give hints for the development of new document templates.
 \<^isadof> is embedded in the underlying generic document model of Isabelle as described in
-@{scholarly_paper.introduction \<open>dof\<close>}. Recall that the document language can be extended dynamically, \<^ie>, new
+\<^introduction>\<open>dof\<close>. Recall that the document language can be extended dynamically, \<^ie>, new
 \<^emph>\<open>user-defined\<close> can be introduced at run-time. This is similar to the definition of new functions
 in an interpreter. \<^isadof> as a system plugin provides a number of new command definitions in
 Isabelle's document model.
@@ -119,7 +119,7 @@ text\<open>
 \<^item> attributes may have default values in order to facilitate notation.
 \<^boxed_theory_text>\<open>doc_class\<close>'es and \<^boxed_theory_text>\<open>onto_class\<close>'es respectively, have a semantics,
-\<^ie>, a logical representation as extensible records in HOL (\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>,
+\<^ie>, a logical representation as extensible records in HOL (@{cite "wenzel:isabelle-isar:2020"},
 pp. 11.6); there are therefore amenable to logical reasoning.
 \<close>
@@ -169,25 +169,25 @@ text\<open>
 As ODL is an extension of Isabelle/HOL, document class definitions can therefore be arbitrarily
 mixed with standard HOL specification constructs. To make this manual self-contained, we present
 syntax and semantics of the specification constructs that are most likely relevant for the
-developer of ontologies (for more details, see~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>). Our
+developer of ontologies (for more details, see~@{cite "wenzel:isabelle-isar:2020"}). Our
 presentation is a simplification of the original sources following the needs of ontology developers
 in \<^isadof>:
 \<^item> \<open>name\<close>:\<^index>\<open>name@\<open>name\<close>\<close>
 with the syntactic category of \<open>name\<close>'s we refer to alpha-numerical identifiers
-(called \<open>short_ident\<close>'s in \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>) and identifiers
+(called \<open>short_ident\<close>'s in @{cite "wenzel:isabelle-isar:2020"}) and identifiers
 in \<^boxed_theory_text>\<open>...\<close> which might contain certain ``quasi-letters'' such
-as \<^boxed_theory_text>\<open>_\<close>, \<^boxed_theory_text>\<open>-\<close>, \<^boxed_theory_text>\<open>.\<close> (see~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close> for
+as \<^boxed_theory_text>\<open>_\<close>, \<^boxed_theory_text>\<open>-\<close>, \<^boxed_theory_text>\<open>.\<close> (see~@{cite "wenzel:isabelle-isar:2020"} for
 details).
 % TODO
 % This phrase should be reviewed to clarify identifiers.
 % Peculiarly, "and identifiers in \<^boxed_theory_text>\<open> ... \<close>".
 \<^item> \<open>tyargs\<close>:\<^index>\<open>tyargs@\<open>tyargs\<close>\<close>
 \<^rail>\<open> typefree | ('(' (typefree * ',') ')')\<close>
-\<open>typefree\<close> denotes fixed type variable (\<open>'a\<close>, \<open>'b\<close>, ...) (see~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>)
+\<open>typefree\<close> denotes fixed type variable (\<open>'a\<close>, \<open>'b\<close>, ...) (see~@{cite "wenzel:isabelle-isar:2020"})
 \<^item> \<open>dt_name\<close>:\<^index>\<open>dt\_npurdahame@\<open>dt_name\<close>\<close>
 \<^rail>\<open> (tyargs?) name (mixfix?)\<close>
 The syntactic entity \<open>name\<close> denotes an identifier, \<open>mixfix\<close> denotes the usual
-parenthesized mixfix notation (see \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>).
+parenthesized mixfix notation (see @{cite "wenzel:isabelle-isar:2020"}).
 The \<^emph>\<open>name\<close>'s referred here are type names such as \<^typ>\<open>int\<close>, \<^typ>\<open>string\<close>,
 \<^type>\<open>list\<close>, \<^type>\<open>set\<close>, etc.
 \<^item> \<open>type_spec\<close>:\index{type_spec@\<open>type_spec\<close>}
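[Editorial aside, not part of the diff.] The syntactic categories enumerated in the hunk above (\<open>name\<close>, \<open>tyargs\<close>, \<open>dt_name\<close>, ...) combine in ODL class definitions. A hedged sketch in the spirit of the manual, not taken from the changed files; the class and attribute names are hypothetical:

```isabelle
(* Hypothetical ODL sketch: the class and attribute identifiers are
   "name"s, the attribute types are built from type names such as
   string and the type constructor list, and "<=" supplies a default. *)
doc_class my_note =
   topic :: "string"
   refs  :: "string list" <= "[]"
```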
@@ -212,13 +212,13 @@ text\<open>
 % Show string in the document output for \<^boxed_theory_text>\<open>ab c\<close> (strings)
 \<^boxed_theory_text>\<open>{1,2,3}\<close> (sets), \<^boxed_theory_text>\<open>(1,2,3)\<close> (tuples),
 \<^boxed_theory_text>\<open>\<forall> x. P(x) \<and> Q x = C\<close> (formulas). For comprehensive overview,
-see~\<^cite>\<open>"nipkow:whats:2020"\<close>.
+see~@{cite "nipkow:whats:2020"}.
 \<close>
 text\<open>
 Advanced ontologies can, \<^eg>, use recursive function definitions with
-pattern-matching~\<^cite>\<open>"kraus:defining:2020"\<close>, extensible record
-specifications~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>, and abstract type declarations.
+pattern-matching~@{cite "kraus:defining:2020"}, extensible record
+specifications~@{cite "wenzel:isabelle-isar:2020"}, and abstract type declarations.
 \<close>
 text\<open>\<^isadof> works internally with fully qualified names in order to avoid confusions
@@ -290,7 +290,7 @@ text\<open>
 class representation, the identifier \<^boxed_latex>\<open>#1\<close> refers to the content of the main text of the
 document class (written in \<^boxed_theory_text>\<open>\<open> ... \<close>\<close>) and the attributes can be referenced
 by their name using the \<^ltxinline>\<open>\commandkey{...}\<close>-command (see the documentation of the
-\<^LaTeX>-package ``keycommand''~\<^cite>\<open>"chervet:keycommand:2010"\<close> for details). Usually, the
+\<^LaTeX>-package ``keycommand''~@{cite "chervet:keycommand:2010"} for details). Usually, the
 representations definition needs to be wrapped in a
 \<^ltxinline>\<open>\begin{isarmarkup}...\end{isamarkup}\<close>-environment, to ensure the correct context
 within Isabelle's \<^LaTeX>-setup.
@@ -353,7 +353,7 @@ text\<open>
 as specified in @{technical \<open>odl_manual0\<close>}.
 \<^item> \<open>meta_args\<close> :
 \<^rail>\<open>obj_id ('::' class_id) ((',' attribute '=' HOL_term) *) \<close>
-\<^item> \<^emph>\<open>evaluator\<close>: from \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>, evaluation is tried first using ML,
+\<^item> \<^emph>\<open>evaluator\<close>: from @{cite "wenzel:isabelle-isar:2020"}, evaluation is tried first using ML,
 falling back to normalization by evaluation if this fails. Alternatively a specific
 evaluator can be selected using square brackets; typical evaluators use the
 current set of code equations to normalize and include \<open>simp\<close> for fully
@@ -433,7 +433,7 @@ term antiquotations:
 will get the value of the \<^const>\<open>definition_list\<close> attribute of the instance \<open>odl-manual1\<close>.
 \<^theory_text>\<open>value_\<close> may have an optional argument between square brackets to specify the evaluator:
 \<^theory_text>\<open>@{value_ [nbe] \<open>definition_list @{technical \<open>odl-manual1\<close>}\<close>}\<close> forces \<open>value_\<close> to evaluate
-the term using normalization by evaluation (see \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>).
+the term using normalization by evaluation (see @{cite "wenzel:isabelle-isar:2020"}).
 \<close>
 (*<*)
@@ -451,7 +451,7 @@ text*[cc_assumption_test_ref::cc_assumption_test]\<open>\<close>
 definition tag_l :: "'a \<Rightarrow> 'b \<Rightarrow> 'b" where "tag_l \<equiv> \<lambda>x y. y"
-lemma* tagged : "tag_l @{cc_assumption_test \<open>cc_assumption_test_ref\<close>} AAA \<Longrightarrow> AAA"
+lemma* tagged : "tag_l @{cc_assumption_test \<open>cc_assumption_test_ref\<close>} AA \<Longrightarrow> AA"
 by (simp add: tag_l_def)
 find_theorems name:tagged "(_::cc_assumption_test \<Rightarrow> _ \<Rightarrow> _) _ _ \<Longrightarrow>_"
@@ -495,7 +495,7 @@ The major commands providing term-contexts are\<^footnote>\<open>The meta-argume
 Wrt. creation, checking and traceability, these commands are analogous to the ontological text and
 code-commands. However the argument terms may contain term-antiquotations stemming from an
 ontology definition. Term-contexts were type-checked and \<^emph>\<open>validated\<close> against
-the global context (so: in the term @{term_ [source] \<open>@{scholarly_paper.author \<open>bu\<close>}\<close>}, \<open>bu\<close>
+the global context (so: in the term @{term_ [source] \<open>@{author \<open>bu\<close>}\<close>}, \<open>bu\<close>
 is indeed a string which refers to a meta-object belonging to the document class \<^typ>\<open>author\<close>,
 for example). With the exception of the @{command "term*"}-command, the term-antiquotations
 in the other term-contexts are additionally expanded (\<^eg> replaced) by the instance of the class,
@@ -514,7 +514,7 @@ with the \<^boxed_theory_text>\<open>disable_assert_evaluation\<close> theory at
 Then @{command "assert*"} will act like @{command "term*"}.
 The @{command "definition*"}-command allows \<open>prop\<close>, \<open>spec_prems\<close>, and \<open>for_fixes\<close>
-(see the @{command "definition"} command in \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>) to contain
+(see the @{command "definition"} command in @{cite "wenzel:isabelle-isar:2020"}) to contain
 term-antiquotations. For example:
 @{boxed_theory_text [display]
 \<open>doc_class A =
@@ -525,7 +525,7 @@ definition*[a1::A, x=5, level="Some 1"] xx' where "xx' \<equiv> A.x @{A \<open>a
 The @{term_ [source] \<open>@{A \<open>a1\<close>}\<close>} term-antiquotation is used both in \<open>prop\<close> and in \<open>spec_prems\<close>.
 @{command "lemma*"}, @{command "theorem*"}, etc., are extended versions of the goal commands
-defined in \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>. Term-antiquotations can be used either in
+defined in @{cite "wenzel:isabelle-isar:2020"}. Term-antiquotations can be used either in
 a \<open>long_statement\<close> or in a \<open>short_statement\<close>.
 For instance:
 @{boxed_theory_text [display]
@@ -671,7 +671,7 @@ doc_class text_element =
 variants :: "String.literal set" <= "{STR ''outline'', STR ''document''}"
 \<close>}
-As mentioned in @{scholarly_paper.technical \<open>sss\<close>},
+As mentioned in @{technical \<open>sss\<close>},
 \<^const>\<open>level\<close> defines the section-level (\<^eg>, using a \<^LaTeX>-inspired hierarchy:
 from \<^boxed_theory_text>\<open>Some -1\<close> (corresponding to \<^boxed_latex>\<open>\part\<close>) to
 \<^boxed_theory_text>\<open>Some 0\<close> (corresponding to \<^boxed_latex>\<open>\chapter\<close>, respectively, \<^boxed_theory_text>\<open>chapter*\<close>)
@@ -807,11 +807,11 @@ doc_class text_section = text_element +
 Besides attributes of more practical considerations like a \<^const>\<open>fixme_list\<close>, that can be modified
 during the editing process but is only visible in the integrated source but usually ignored in the
-\<^LaTeX>, this class also introduces the possibility to assign an ``ownership'' or ``responsibility'' of
+\<^LaTeX>, this class also introduces the possibility to assign an ``ownership" or ``responsibility" of
 a \<^typ>\<open>text_element\<close> to a specific \<^typ>\<open>author\<close>. Note that this is possible since \<^isadof> assigns to each
 document class also a class-type which is declared in the HOL environment.\<close>
-text*[s23::example, main_author = "Some(@{scholarly_paper.author \<open>bu\<close>})"]\<open>
+text*[s23::example, main_author = "Some(@{author \<open>bu\<close>})"]\<open>
 Recall that concrete authors can be denoted by term-antiquotations generated by \<^isadof>; for example,
 this may be for a text fragment like
 @{boxed_theory_text [display]
@@ -960,7 +960,7 @@ subsubsection\<open>For Isabelle Hackers: Defining New Top-Level Commands\<close
 text\<open>
 Defining such new top-level commands requires some Isabelle knowledge as well as
 extending the dispatcher of the \<^LaTeX>-backend. For the details of defining top-level
-commands, we refer the reader to the Isar manual~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>.
+commands, we refer the reader to the Isar manual~@{cite "wenzel:isabelle-isar:2020"}.
 Here, we only give a brief example how the \<^boxed_theory_text>\<open>section*\<close>-command is defined; we
 refer the reader to the source code of \<^isadof> for details.
@@ -1157,7 +1157,7 @@ text\<open>
 The \<^boxed_theory_text>\<open>\<sigma>\<close> symbol is reserved and references the future instance class.
 By relying on the implementation of the Records
-in Isabelle/HOL~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>,
+in Isabelle/HOL~@{cite "wenzel:isabelle-isar:2020"},
 one can reference an attribute of an instance using its selector function.
 For example, \<^boxed_theory_text>\<open>int1 \<sigma>\<close> denotes the value
 of the \<^boxed_theory_text>\<open>int1\<close> attribute
@@ -1370,23 +1370,6 @@ value*\<open>filter (\<lambda>\<sigma>. the (text_section.level \<sigma>) > 1) @
 section*[infrastructure::technical]\<open>Technical Infrastructure\<close>
-subsection\<open>The Previewer\<close>
-figure*[
-  global_DOF_view::figure
-  , relative_width="95"
-  , file_src="''figures/ThisPaperWithPreviewer.png''"
-]\<open>A Screenshot while editing this Paper in \<^dof> with Preview.\<close>
-text\<open>A screenshot of the editing environment is shown in \<^figure>\<open>global_DOF_view\<close>. It supports
-incremental continuous PDF generation which improves usability. Currently, the granularity
-is restricted to entire theories (which have to be selected in a specific document pane).
-The response times can not (yet) compete
-with a Word- or Overleaf editor, though, which is mostly due to the checking and
-evaluation overhead (the turnaround of this section is about 30 s). However, we believe
-that better parallelization and evaluation techniques will decrease this gap substantially for the
-most common cases in future versions. \<close>
 subsection\<open>Developing Ontologies and their Representation Mappings\<close>
 text\<open>
 The document core \<^emph>\<open>may\<close>, but \<^emph>\<open>must\<close> not use Isabelle definitions or proofs for checking the
@@ -1542,8 +1525,8 @@ text\<open>
 Examples of this can be found, \<^eg>, in the ontology-style
 \<^file>\<open>../../ontologies/scholarly_paper/DOF-scholarly_paper.sty\<close>.
 For details about the expansion mechanism
-in general, we refer the reader to the \<^LaTeX> literature (\<^eg>,~\<^cite>\<open>"knuth:texbook:1986"
-and "mittelbach.ea:latex:1999" and "eijkhout:latex-cs:2012"\<close>).
+in general, we refer the reader to the \<^LaTeX> literature (\<^eg>,~@{cite "knuth:texbook:1986"
+and "mittelbach.ea:latex:1999" and "eijkhout:latex-cs:2012"}).
 \<close>

View File

@@ -12,9 +12,9 @@
 *************************************************************************)
 (*<*)
-theory "M_07_Implementation"
-  imports "M_06_RefMan"
+theory "M_05_Implementation"
+  imports "M_04_RefMan"
 begin
 (*>*)
@@ -33,7 +33,7 @@ text\<open>
 \<close>
 text\<open>
 Semantic macros, as required by our document model, are called \<^emph>\<open>document antiquotations\<close>
-in the Isabelle literature~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>. While Isabelle's code-antiquotations
+in the Isabelle literature~@{cite "wenzel:isabelle-isar:2020"}. While Isabelle's code-antiquotations
 are an old concept going back to Lisp and having found via SML and OCaml their ways into modern
 proof systems, special annotation syntax inside documentation comments have their roots in
 documentation generators such as Javadoc. Their use, however, as a mechanism to embed
@@ -210,13 +210,13 @@ text\<open>
 possible;
 otherwise, if \<^boxed_sml>\<open>next\<close> fails, an error is reported. The automata implementation
 is, in large parts, generated from a formalization of functional automata
-\<^cite>\<open>"nipkow.ea:functional-Automata-afp:2004"\<close>.
+@{cite "nipkow.ea:functional-Automata-afp:2004"}.
 \<close>
 section\<open>The \<^LaTeX>-Core of \<^isadof>\<close>
 text\<open>
 The \<^LaTeX>-implementation of \<^isadof> heavily relies on the
-``keycommand''~\<^cite>\<open>"chervet:keycommand:2010"\<close> package. In fact, the core \<^isadof> \<^LaTeX>-commands
+``keycommand''~@{cite "chervet:keycommand:2010"} package. In fact, the core \<^isadof> \<^LaTeX>-commands
 are just wrappers for the corresponding commands from the keycommand package:
 @{boxed_latex [display]

View File

@@ -7,7 +7,7 @@ Isabelle/DOF allows for both conventional typesetting and formal development.
 Isabelle/DOF has two major prerequisites:
-* **Isabelle 2024:** Isabelle/DOF requires [Isabelle](https://isabelle.in.tum.de/)
+* **Isabelle 2023:** Isabelle/DOF requires [Isabelle](https://isabelle.in.tum.de/)
   and several entries from the [Archive of Formal Proofs
   (AFP)](https://www.isa-afp.org/).
 * **LaTeX:** Isabelle/DOF requires a modern LaTeX installation, i.e., at least
@@ -16,6 +16,7 @@ Isabelle/DOF has two major prerequisites:
 ## Installation
+<!--
 Isabelle/DOF is available as part of the [Archive of Formal Proofs
 (AFP)](https://www.isa-afp.org/). This is the most convenient way to install
 Isabelle/DOF for the latest official release of Isabelle.
@@ -28,21 +29,28 @@ distribution for your operating system from the [Isabelle
 website](https://isabelle.in.tum.de/). Furthermore, please install the AFP
 following the instructions given at <https://www.isa-afp.org/help.html>.
-Isabelle/DOF is provided as one AFP entry:
+Isabelle/DOF is currently consisting out of three AFP entries:
 * [Isabelle_DOF:](https://www.isa-afp.org/entries/Isabelle_DOF.html) This entry
   contains the Isabelle/DOF system itself, including the Isabelle/DOF manual.
+* [Isabelle_DOF-Example-I:](https://www.isa-afp.org/entries/Isabelle_DOF-Example-I.html)
+  This entry contains an example of an academic paper written using the
+  Isabelle/DOF system.
+* [Isabelle_DOF-Example-II:](https://www.isa-afp.org/entries/Isabelle_DOF-Example-II.html)
+  This entry contains an example of an academic paper written using the
+  Isabelle/DOF system.
+-->
 ### Installation of the Development Version (Git Repository)
 The development version of Isabelle/DOF that is available in this Git repository
 provides, over the AFP version, additional ontologies, document templates, and
-examples that might not yet “ready for general use”. Furthermore, as it is
+examples that might not yet "ready for general use". Furthermore, as it is
 provided as an Isabelle component, it can also provide additional tools that
-cannot be distributed via the AFP. As this repository provides a (potentially)
-updated version of Isabelle/DOF, it conflicts with a complete installation of
-the AFP. For more details on installing the development version, please consult
-the [README_DEVELOPMENT.md](./README_DEVELOPMENT.md) file.
+cannot be distributed via the AFP. For more details on installing the
+development version, please consult the
+[README_DEVELOPMENT.md](./README_DEVELOPMENT.md) file.
 After installing the prerequisites, change into the directory containing
 Isabelle/DOF (this should be the directory containing this ``README.md`` file)
@@ -91,9 +99,9 @@ experience, as a lot of internal theories are loaded into Isabelle's editor.
 ## Repository Structure
-The ``main`` branch of this repository is developed using the latest official
+The ``main`` branch of this Repository is developed using the latest official
 release of Isabelle. This is also the main development branch. In addition, he
-``[isabelle_nightly](https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF/src/branch/isabelle_nightly)`` branch is used for testing Isabelle/DOF with the latest
+``Isabelle_dev`` branch is used for testing Isabelle/DOF with the latest
 development version of Isabelle.
 This repository is structured into several Isabelle sessions, each of which is stored

Some files were not shown because too many files have changed in this diff.