Compare commits


42 Commits

Author SHA1 Message Date
Achim D. Brucker 9feeb63665 Linting ... 2024-04-26 07:00:44 +01:00
Achim D. Brucker 55e42142fa Spell Checking. 2024-04-26 02:32:25 +01:00
Achim D. Brucker 26774fc053 Merge branch 'afp_resubmission' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF into afp_resubmission 2024-04-18 09:11:26 +01:00
Burkhart Wolff 7d6048bf64 Merge branch 'afp_resubmission' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF into afp_resubmission 2024-04-16 11:11:07 +02:00
Burkhart Wolff 231892cd23 ... 2024-04-16 11:11:04 +02:00
Nicolas Méric c945da75fa Proof reading 2024-04-15 14:28:31 +02:00
Nicolas Méric b554f20a5c Move onto_morphism command to isabelle dof core 2024-04-15 12:57:42 +02:00
Burkhart Wolff 734c1953bd polishing 2024-04-12 18:52:57 +02:00
Burkhart Wolff a735e9a1f2 roughly complete section 5 on proofs over ontologies. 2024-04-12 18:42:04 +02:00
Burkhart Wolff 7c2a6099f8 section on previewer 2024-04-11 21:08:36 +02:00
Burkhart Wolff 6dfefc6b4e some elements in the proof chapter 2024-04-09 09:27:14 +02:00
Burkhart Wolff 3235410af3 some elements of chapter Proof_Ontologies. 2024-04-07 22:07:40 +02:00
Burkhart Wolff 4745c58803 some elements of chapter Proof_Ontologies. 2024-04-07 22:04:09 +02:00
Burkhart Wolff 28d1fa926e Merge branch 'main' into afp_resubmission 2024-04-07 17:19:24 +02:00
Burkhart Wolff b651116af3 nicer presentation of proofs, closer to automation. 2024-04-05 15:23:33 +02:00
Nicolas Méric 93ef94cddb Move some theorems in regexpinterface 2024-04-05 14:56:55 +02:00
Nicolas Méric 20e90f688f Merge branch 'main' into afp_resubmission 2024-04-05 14:30:03 +02:00
Burkhart Wolff 02b6d0b048 towards mechanization. 2024-04-05 13:40:44 +02:00
Burkhart Wolff f5a94ca962 completed proofs wrt. ordered Language of monitors. 2024-04-04 21:27:20 +02:00
Burkhart Wolff 6b2879d1d6 bric a brac 2024-04-04 15:52:26 +02:00
Nicolas Méric 51d93e38f8 Add constants for first monitor accepts clause 2024-04-04 12:08:35 +02:00
Nicolas Méric 7791538b54 Update to isabelle 2023 and add morphism examples 2024-04-04 12:08:25 +02:00
Burkhart Wolff dee3b47d06 some more lemmas 2024-04-04 10:08:41 +02:00
Burkhart Wolff 0bf21336f1 cleanup 2024-04-03 21:10:53 +02:00
Burkhart Wolff 2b12e53cf4 first monitor proof without sorries. 2024-04-03 20:51:21 +02:00
Burkhart Wolff c2eea7696b ... 2024-04-03 16:15:29 +02:00
Burkhart Wolff 20f163eba9 experiments 2024-04-03 15:01:25 +02:00
Burkhart Wolff 2d2cb6c8ce some experiments on monitor proofs- 2024-04-02 23:50:32 +02:00
Achim D. Brucker f61e107515 Updated scala parts to Isabelle 2023. 2024-04-02 12:36:44 +01:00
Burkhart Wolff d3aefa63b1 ... 2024-04-02 13:14:18 +02:00
Achim D. Brucker f0c379a5d2 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2024-04-02 08:34:13 +01:00
Achim D. Brucker 5d5eef1a46 Added svglov3.clo. 2024-04-02 08:33:55 +01:00
Nicolas Méric 291b2e258f Use name spaced names for docitem_name text anti-quotation 2024-03-12 18:21:53 +01:00
Nicolas Méric 10b98b660f Fix typo 2024-03-12 16:02:25 +01:00
Nicolas Méric a1677384b3 Update bug declare_reference* bug example 2024-03-11 15:51:34 +01:00
Nicolas Méric 46b094939a Update bug example in Test_Polymorphic_Classes 2024-03-04 14:32:46 +01:00
Nicolas Méric 42da18cf3a Reference a bug in polymorphic class implementation 2024-03-04 14:30:58 +01:00
Nicolas Méric 1740898171 Add message for matching error in class invariants
Give feedback for not well formed class invariants
2024-03-01 17:24:12 +01:00
Achim D. Brucker aa0a2c5f6a Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2024-02-29 17:36:18 +00:00
Achim D. Brucker a79a3f539d Added basic author and title information. 2024-02-29 17:36:08 +00:00
Achim D. Brucker ab05663738 Added stub for \inst{}. 2024-02-29 17:35:55 +00:00
Nicolas Méric 41dd3e4949 Update output latex macros name
Allow instance names compatible with binding names,
including names with subscripts
2024-02-29 10:16:51 +01:00
33 changed files with 1397 additions and 111 deletions

View File

@@ -1,3 +1,4 @@
(*<*)
theory
"template-beamer-UNSUPPORTED"
imports
@@ -9,5 +10,63 @@ list_templates
use_template "beamer-UNSUPPORTED"
list_ontologies
use_ontology "scholarly_paper"
(*>*)
end
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
(*
author*[bob, email = "\<open>bob@example.com\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
*)
text\<open>
\begin{frame}
\frametitle{Example Slide}
\centering\huge This is an example!
\end{frame}
\<close>
frame*[test_frame
, frametitle = \<open>\<open>\<open>Example Slide\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close> with items @{thm "HOL.refl"}\<close>\<close>
, framesubtitle = "''Subtitle''"]
\<open>This is an example!
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> and the term encoding the title of this frame is \<^term_>\<open>frametitle @{frame \<open>test_frame\<close>}\<close>\<close>
frame*[test_frame2
, frametitle = "''Example Slide''"
, framesubtitle = \<open>\<open>\<open>Subtitle\<^sub>t\<^sub>e\<^sub>s\<^sub>t:\<close> the value of \<^term>\<open>(3::int) + 3\<close> is @{value "(3::int) + 3"}\<close>\<close>]
\<open>Test frame env \<^term>\<open>refl\<close>\<close>
frame*[test_frame3
, options = "''allowframebreaks''"
, frametitle = "''Example Slide with frame break''"
, framesubtitle = \<open>\<open>\<open>Subtitle\<^sub>t\<^sub>e\<^sub>s\<^sub>t:\<close> the value of \<^term>\<open>(3::int) + 3\<close> is @{value "(3::int) + 3"}\<close>\<close>]
\<open>
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> and the term encoding the title of this frame is \<^term_>\<open>frametitle @{frame \<open>test_frame3\<close>}\<close>
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<close>
(*<*)
end
(*>*)

View File

@ -10,4 +10,12 @@ use_template "beamerposter-UNSUPPORTED"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -10,4 +10,12 @@ use_template "lipics-v2021-UNSUPPORTED"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -10,4 +10,12 @@ use_template "lncs"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -10,4 +10,12 @@ use_template "scrartcl"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -10,4 +10,12 @@ use_template "scrreprt-modern"
list_ontologies
use_ontology "technical_report"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -10,4 +10,12 @@ use_template "scrreprt"
list_ontologies
use_ontology "technical_report"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -10,4 +10,12 @@ use_template "sn-article-UNSUPPORTED"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@ -8,3 +8,4 @@ session "template-svjour3-UNSUPPORTED" = "Isabelle_DOF-Ontologies" +
document_files
"preamble.tex"
"svjour3.cls"
"svglov3.clo"

View File

@@ -0,0 +1,101 @@
% SVJour3 DOCUMENT CLASS OPTION SVGLOV3 -- for standardised journals
%
% This is an enhancement for the LaTeX
% SVJour3 document class for Springer journals
%
%%
%%
%% \CharacterTable
%% {Upper-case \A\B\C\D\E\F\G\H\I\J\K\L\M\N\O\P\Q\R\S\T\U\V\W\X\Y\Z
%% Lower-case \a\b\c\d\e\f\g\h\i\j\k\l\m\n\o\p\q\r\s\t\u\v\w\x\y\z
%% Digits \0\1\2\3\4\5\6\7\8\9
%% Exclamation \! Double quote \" Hash (number) \#
%% Dollar \$ Percent \% Ampersand \&
%% Acute accent \' Left paren \( Right paren \)
%% Asterisk \* Plus \+ Comma \,
%% Minus \- Point \. Solidus \/
%% Colon \: Semicolon \; Less than \<
%% Equals \= Greater than \> Question mark \?
%% Commercial at \@ Left bracket \[ Backslash \\
%% Right bracket \] Circumflex \^ Underscore \_
%% Grave accent \` Left brace \{ Vertical bar \|
%% Right brace \} Tilde \~}
\ProvidesFile{svglov3.clo}
[2006/02/03 v3.1
style option for standardised journals]
\typeout{SVJour Class option: svglov3.clo for standardised journals}
\def\validfor{svjour3}
\ExecuteOptions{final,10pt,runningheads}
% No size changing allowed, hence a "copy" of size10.clo is included
\renewcommand\normalsize{%
\if@twocolumn
\@setfontsize\normalsize\@xpt{12.5pt}%
\else
\if@smallext
\@setfontsize\normalsize\@xpt\@xiipt
\else
\@setfontsize\normalsize{9.5pt}{11.5pt}%
\fi
\fi
\abovedisplayskip=3 mm plus6pt minus 4pt
\belowdisplayskip=3 mm plus6pt minus 4pt
\abovedisplayshortskip=0.0 mm plus6pt
\belowdisplayshortskip=2 mm plus4pt minus 4pt
\let\@listi\@listI}
\normalsize
\newcommand\small{%
\if@twocolumn
\@setfontsize\small{8.5pt}\@xpt
\else
\if@smallext
\@setfontsize\small\@viiipt{9.5pt}%
\else
\@setfontsize\small\@viiipt{9.25pt}%
\fi
\fi
\abovedisplayskip 8.5\p@ \@plus3\p@ \@minus4\p@
\abovedisplayshortskip \z@ \@plus2\p@
\belowdisplayshortskip 4\p@ \@plus2\p@ \@minus2\p@
\def\@listi{\leftmargin\leftmargini
\parsep 0\p@ \@plus1\p@ \@minus\p@
\topsep 4\p@ \@plus2\p@ \@minus4\p@
\itemsep0\p@}%
\belowdisplayskip \abovedisplayskip
}
\let\footnotesize\small
\newcommand\scriptsize{\@setfontsize\scriptsize\@viipt\@viiipt}
\newcommand\tiny{\@setfontsize\tiny\@vpt\@vipt}
\if@twocolumn
\newcommand\large{\@setfontsize\large\@xiipt\@xivpt}
\newcommand\LARGE{\@setfontsize\LARGE{16pt}{18pt}}
\else
\newcommand\large{\@setfontsize\large\@xipt\@xiipt}
\newcommand\LARGE{\@setfontsize\LARGE{13pt}{15pt}}
\fi
\newcommand\Large{\@setfontsize\Large\@xivpt{16dd}}
\newcommand\huge{\@setfontsize\huge\@xxpt{25}}
\newcommand\Huge{\@setfontsize\Huge\@xxvpt{30}}
%
\def\runheadhook{\rlap{\smash{\lower6.5pt\hbox to\textwidth{\hrulefill}}}}
\if@twocolumn
\setlength{\textwidth}{17.4cm}
\setlength{\textheight}{234mm}
\AtEndOfClass{\setlength\columnsep{6mm}}
\else
\if@smallext
\setlength{\textwidth}{11.9cm}
\setlength{\textheight}{19.4cm}
\else
\setlength{\textwidth}{12.2cm}
\setlength{\textheight}{19.8cm}
\fi
\fi
%
\AtBeginDocument{%
\@ifundefined{@journalname}
{\typeout{Unknown journal: specify \string\journalname\string{%
<name of your journal>\string} in preambel^^J}}{}}
%
\endinput
%%
%% End of file `svglov3.clo'.

View File

@ -10,4 +10,12 @@ use_template "svjour3-UNSUPPORTED"
list_ontologies
use_ontology "scholarly_paper"
title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
http_site = "\<open>https://example.com/alice\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
http_site = "\<open>https://example.com/bob\<close>",
affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>
end

View File

@@ -58,7 +58,7 @@
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\renewcommand{\DOFauthor}{}
\renewcommand{\DOFinstitute}{}
\expandafter\newcommand\csname 2authand\endcsname{}
%\expandafter\newcommand\csname 2authand\endcsname{}
\expandafter\newcommand\csname 3authand\endcsname{}
\expandafter\newcommand\csname 4authand\endcsname{}

View File

@@ -26,6 +26,7 @@
\bibliographystyle{sn-basic}
\let\proof\relax
\let\endproof\relax
\newcommand{\inst}[1]{}%
\newcommand{\institute}[1]{}
\usepackage{manyfoot}
\usepackage{DOF-core}

View File

@@ -27,7 +27,7 @@
\usepackage{DOF-core}
\usepackage{mathptmx}
\bibliographystyle{abbrvnat}
\newcommand{\inst}[1]{}%
\usepackage[numbers, sort&compress, sectionbib]{natbib}
\usepackage{hyperref}

View File

@@ -37,13 +37,14 @@ fun gen_enriched_document_command2 name {body} cid_transform attr_transform mark
xstring_opt:(xstring * Position.T) option),
toks_list:Input.source list)
: theory -> theory =
let val ((binding,cid_pos), doc_attrs) = meta_args
let val toplvl = Toplevel.make_state o SOME
val ((binding,cid_pos), doc_attrs) = meta_args
val oid = Binding.name_of binding
val oid' = if meta_args = ODL_Meta_Args_Parser.empty_meta_args
then "output"
else oid
(* as side-effect, generates markup *)
fun check_n_tex_text thy toks = let val ctxt = Toplevel.presentation_context (Toplevel.make_state (SOME thy))
fun check_n_tex_text thy toks = let val ctxt = Toplevel.presentation_context (toplvl thy)
val pos = Input.pos_of toks;
val _ = Context_Position.reports ctxt
[(pos, Markup.language_document (Input.is_delimited toks)),

View File

@@ -693,4 +693,27 @@ lemma*[e5::E] testtest : "xx + testtest_level = yy + testtest_level \<Longrighta
text\<open>Indeed this fails:\<close>
(*lemma*[e6::E] testtest : "xx + the (level @{test2 \<open>testtest2''\<close>}) = yy + the (level @{test2 \<open>testtest2''\<close>}) \<Longrightarrow> xx = yy" by simp*)
text\<open>Bug: Checking of \<^theory_text>\<open>text*\<close> type against \<^theory_text>\<open>declare_reference*\<close> is not done.
Should be compatible with type unification mechanism. See the record package\<close>
doc_class 'a AAA_test =
aaa::"'a list"
doc_class 'a BBB_test =
bbb::"'a list"
declare_reference*[aaa_test::"'a::one AAA_test"]
text\<open>@{AAA_test (unchecked) \<open>aaa_test\<close>}\<close>
text\<open>\<open>aaa_test\<close> should fail and trigger an error because its type was \<^typ>\<open>'a::one AAA_test\<close>
in the \<^theory_text>\<open>declare_reference*\<close> command:\<close>
(*text*[aaa_test::"'a::one BBB_test"]\<open>\<close>*)
text*[aaa_test::"int AAA_test"]\<open>\<close>
text\<open>\<open>aaa_test'\<close> should fail and trigger an error because its type \<^typ>\<open>string AAA_test\<close>
is not compatible with its type \<^typ>\<open>'a::one AAA_test\<close> declared in
the \<^theory_text>\<open>declare_reference*\<close> command:\<close>
text*[aaa_test'::"string AAA_test"]\<open>\<close>
end

View File

@ -22,8 +22,10 @@ session "Isabelle_DOF" (AFP) = "Functional-Automata" +
"thys/manual/M_01_Introduction"
"thys/manual/M_02_Background"
"thys/manual/M_03_GuidedTour"
"thys/manual/M_04_RefMan"
"thys/manual/M_05_Implementation"
"thys/manual/M_04_Document_Ontology"
"thys/manual/M_05_Proofs_Ontologies"
"thys/manual/M_06_RefMan"
"thys/manual/M_07_Implementation"
"thys/manual/Isabelle_DOF_Manual"
document_files
"root.bib"
@ -51,6 +53,7 @@ session "Isabelle_DOF" (AFP) = "Functional-Automata" +
"figures/doc-mod-onto-docinst.pdf"
"figures/doc-mod-DOF.pdf"
"figures/doc-mod-term-aq.pdf"
"figures/ThisPaperWithPreviewer.png"
export_classpath

View File

@@ -2,6 +2,8 @@
\input{M_01_Introduction.tex}
\input{M_02_Background.tex}
\input{M_03_GuidedTour.tex}
\input{M_04_RefMan.tex}
\input{M_05_Implementation.tex}
\input{M_04_Document_Ontology.tex}
\input{M_05_Proofs_Ontologies.tex}
\input{M_06_RefMan.tex}
\input{M_07_Implementation.tex}
\input{Isabelle_DOF_Manual.tex}

Binary file not shown (new image, 541 KiB).

View File

@@ -10,6 +10,39 @@
year = 2015
}
@Book{ books/daglib/0032976,
added-at = {2014-03-12T00:00:00.000+0100},
author = {Euzenat, J{\'e}r{\^o}me and Shvaiko, Pavel},
biburl = {https://www.bibsonomy.org/bibtex/28d5372a81f181d9d5a761ca12209cf39/dblp},
interhash = {fc55a5b84d114e38db0a0303cc1bd7da},
intrahash = {8d5372a81f181d9d5a761ca12209cf39},
isbn = {978-3-642-38720-3},
keywords = {dblp},
pages = {I-XVII, 1--511},
publisher = {Springer},
timestamp = {2015-06-18T09:49:52.000+0200},
title = {Ontology Matching, Second Edition.},
year = 2013,
doi = {10.1007/978-3-642-38721-0}
}
@Misc{ atl,
title = {{ATL} -- A model transformation technology},
url = {https://www.eclipse.org/atl/},
author = {{Eclipse Foundation}},
}
@InProceedings{ BGPP95,
author = {Yamine A{\"{\i}}t Ameur and Frederic Besnard and Patrick Girard and Guy Pierra and Jean{-}Claude
Potier},
title = {Formal Specification and Metaprogramming in the {EXPRESS} Language},
booktitle = {The 7th International Conference on Software Engineering and Knowledge Engineering (SEKE)},
pages = {181--188},
publisher = {Knowledge Systems Institute},
year = 1995
}
@Misc{ ibm:doors:2019,
author = {IBM},
title = {{IBM} Engineering Requirements Management {DOORS} Family},

View File

@@ -590,15 +590,15 @@ text \<open>underlying idea: a monitor class automatically receives a
doc_class article =
style_id :: string <= "''LNCS''"
version :: "(int \<times> int \<times> int)" <= "(0,0,0)"
accepts "(title ~~
\<lbrakk>subtitle\<rbrakk> ~~
\<lbrace>author\<rbrace>\<^sup>+ ~~
abstract ~~
\<lbrace>introduction\<rbrace>\<^sup>+ ~~
\<lbrace>background\<rbrace>\<^sup>* ~~
\<lbrace>technical || example \<rbrace>\<^sup>+ ~~
\<lbrace>conclusion\<rbrace>\<^sup>+ ~~
bibliography ~~
accepts "(title ~~
\<lbrakk>subtitle\<rbrakk> ~~
\<lbrace>author\<rbrace>\<^sup>+ ~~
abstract ~~
\<lbrace>introduction\<rbrace>\<^sup>+ ~~
\<lbrace>background\<rbrace>\<^sup>* ~~
\<lbrace>technical || example || float \<rbrace>\<^sup>+ ~~
\<lbrace>conclusion\<rbrace>\<^sup>+ ~~
bibliography ~~
\<lbrace>annex\<rbrace>\<^sup>* )"
@@ -662,6 +662,7 @@ let val cidS = ["scholarly_paper.introduction","scholarly_paper.technical",
in DOF_core.add_ml_invariant binding (DOF_core.make_ml_invariant (body, cid_long)) thy end)
\<close>
term\<open>float\<close>
section\<open>Miscellaneous\<close>
subsection\<open>Common Abbreviations\<close>

View File

@@ -75,10 +75,134 @@ doc_class report =
abstract ~~
\<lbrakk>table_of_contents\<rbrakk> ~~
\<lbrace>introduction\<rbrace>\<^sup>+ ~~
\<lbrace>technical || figure \<rbrace>\<^sup>+ ~~
\<lbrace>background\<rbrace>\<^sup>* ~~
\<lbrace>technical || example || float \<rbrace>\<^sup>+ ~~
\<lbrace>conclusion\<rbrace>\<^sup>+ ~~
\<lbrace>index\<rbrace>\<^sup>* ~~
bibliography)"
bibliography ~~
\<lbrakk>index\<rbrakk> ~~ \<lbrace>annex\<rbrace>\<^sup>* )"
section\<open>Experimental\<close>
(* switch on regexp syntax *)
notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
notation Plus (infixr "||" 55)
notation Times (infixr "~~" 60)
notation Atom ("\<lfloor>_\<rfloor>" 65)
text\<open>Not a terribly deep theorem, but an interesting property of consistency between
ontologies - so, a lemma that shouldn't break if the involved ontologies were changed.
It reads as follows:
"The structural language of articles should be included in the structural language of
reports, that is to say, reports should just have a richer structure than ordinary papers." \<close>
theorem articles_sub_reports: \<open>(Lang article_monitor) \<subseteq> Lang report_monitor\<close>
unfolding article_monitor_def report_monitor_def
apply(rule regexp_seq_mono[OF subset_refl] | rule seq_cancel_opt | rule subset_refl)+
done
text\<open>The proof proceeds by blindly applying the monotonicity rules
on the language of regular expressions.\<close>
text\<open>All Class-Id's --- should be generated.\<close>
lemmas class_ids =
SML_def code_def annex_def title_def figure_def chapter_def article_def theorem_def
paragraph_def tech_code_def assumption_def definition_def hypothesis_def
eng_example_def text_element_def math_content_def tech_example_def subsubsection_def
engineering_content_def data_def float_def axiom_def LATEX_def author_def listing_def
abstract_def assertion_def technical_def background_def evaluation_def math_proof_def
math_formal_def bibliography_def math_example_def text_section_def conclusion_stmt_def
math_explanation_def ISAR_def frame_def lemma_def index_def report_def section_def
subtitle_def corollary_def subsection_def conclusion_def experiment_def consequence_def
proposition_def introduction_def related_work_def front_matter_def math_motivation_def
example_def table_of_contents_def tech_definition_def premise_def
definition allClasses
where \<open>allClasses \<equiv>
{SML, code, annex, title,figure,chapter, article, theorem, paragraph,
tech_code, assumption, definition, hypothesis, eng_example, text_element,
math_content,tech_example, subsubsection,tech_definition,
engineering_content,data,float,axiom,LATEX,author,listing, example,abstract,
assertion,technical,background,evaluation,math_proof,math_formal,bibliography,
math_example,text_section,conclusion_stmt,math_explanation,ISAR,frame,
lemma,index,report,section,premise,subtitle,corollary,subsection,conclusion,
experiment, consequence,proposition,introduction,related_work,front_matter,
math_motivation,table_of_contents}\<close>
text\<open>A rudimentary fragment of the class hierarchy re-modeled on classid's :\<close>
definition cid_of where \<open>cid_of = inv Regular_Exp.Atom\<close>
lemma Atom_inverse[simp]:\<open>cid_of (Regular_Exp.Atom a) = a\<close>
unfolding cid_of_def by (meson UNIV_I f_inv_into_f image_eqI rexp.inject(1))
definition doc_class_rel
where \<open>doc_class_rel \<equiv> {(cid_of proposition,cid_of math_content),
(cid_of listing,cid_of float),
(cid_of figure,cid_of float)} \<close>
instantiation "doc_class" :: ord
begin
definition
less_eq_doc_class: "x \<le> y \<longleftrightarrow> (x,y) \<in> doc_class_rel\<^sup>*"
definition
less_doc_class: "(x::doc_class) < y \<longleftrightarrow> (x \<le> y \<and> \<not> y \<le> x)"
instance ..
end
lemma drc_acyclic : "acyclic doc_class_rel"
proof -
let ?measure = "(\<lambda>x.3::int)(cid_of float := 0, cid_of math_content := 0,
cid_of listing := 1, cid_of figure := 1, cid_of proposition := 1)"
show ?thesis
unfolding doc_class_rel_def
apply(rule acyclicI_order [where f = "?measure"])
by(simp only: class_ids)(auto)
qed
instantiation "doc_class" :: order
begin
instance
proof
fix x::"doc_class"
show \<open>x \<le> x\<close>
unfolding less_eq_doc_class by simp
next
fix x y z:: "doc_class"
show \<open>x \<le> y \<Longrightarrow> y \<le> z \<Longrightarrow> x \<le> z\<close>
unfolding less_eq_doc_class
by force
next
fix x y::"doc_class"
have * : "antisym (doc_class_rel\<^sup>*)"
by (simp add: acyclic_impl_antisym_rtrancl drc_acyclic)
show \<open>x \<le> y \<Longrightarrow> y \<le> x \<Longrightarrow> x = y\<close>
apply(insert antisymD[OF *])
using less_eq_doc_class by auto
next
fix x y::"doc_class"
show \<open>(x < y) = (x \<le> y \<and> \<not> y \<le> x)\<close>
by(simp add: less_doc_class)
qed
end
theorem articles_Lsub_reports: \<open>(L\<^sub>s\<^sub>u\<^sub>b article_monitor) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b report_monitor\<close>
unfolding article_monitor_def report_monitor_def
by (meson order_refl regexp_seq_mono' seq_cancel_opt')
end

View File

@@ -36,9 +36,12 @@ theory Isa_DOF (* Isabelle Document Ontology Framework *)
and "open_monitor*" "close_monitor*"
"declare_reference*" "update_instance*"
"doc_class" "onto_class" (* a syntactic alternative *)
"ML*"
"doc_class" "onto_class" (* a syntactic alternative *)
"onto_morphism" :: thy_decl
and "to"
and "ML*"
"define_shortcut*" "define_macro*" :: thy_decl
and "definition*" :: thy_defn
and "theorem*" "lemma*" "corollary*" "proposition*" :: thy_goal_stmt
and "schematic_goal*" :: thy_goal_stmt
@@ -74,6 +77,7 @@ val def_suffixN = "_" ^ defN
val defsN = defN ^ "s"
val instances_of_suffixN = "_instances"
val invariant_suffixN = "_inv"
val monitor_suffixN = "_monitor"
val instance_placeholderN = "\<sigma>"
val makeN = "make"
val schemeN = "_scheme"
@@ -856,8 +860,13 @@ val check_closing_ml_invs =
check_invs get_closing_ml_invariants closing_ml_invariant_class_of closing_ml_invariant_check_of
(* Output name for LaTeX macros.
Also rewrite "." to allow macros with objects long names *)
val output_name = Latex.output_name o translate_string (fn "." => "DOT" | s => s)
Also rewrite "." to allow macros with objects long names
and "\<^sub>" allowed in name bindings and then in instance names *)
val output_name = Symbol.explode
#> map (fn s => if equal s Symbol.sub then "SUB" else s)
#> implode
#> translate_string (fn "." => "DOT" | s => s)
#> Latex.output_name
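(* illustrative example, not part of this change: a binding such as "sec\<^sub>a.intro"
   would be rewritten to "secSUBaDOTintro" before the final LaTeX-name escaping *)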
val ISA_prefix = "Isabelle_DOF_"
@@ -1816,7 +1825,12 @@ fun check_invariants thy binding =
let val ctxt = Proof_Context.init_global thy
val trivial_true = \<^term>\<open>True\<close> |> HOLogic.mk_Trueprop |> Thm.cterm_of ctxt |> Thm.trivial
val evaluated_term = value ctxt term
handle ERROR e =>
handle Match => error ("exception Match raised when checking "
^ inv_name ^ " invariant." ^ Position.here pos ^ "\n"
^ "You might want to check the definition of the invariant\n"
^ "against the value specified in the instance\n"
^ "or the default value declared in the instance class.")
| ERROR e =>
if (String.isSubstring "Wellsortedness error" e)
andalso (Config.get_global thy DOF_core.invariants_checking_with_tactics)
then (warning("Invariants checking uses proof tactics");
@@ -2349,6 +2363,7 @@ fun onto_macro_cmd_command (name, pos) descr cmd output_cmd =
))))
(* Core Command Definitions *)
val _ =
@@ -2681,7 +2696,9 @@ fun gen_theorem schematic bundle_includes prep_att prep_stmt
|> prep_statement (prep_att lthy) prep_stmt elems raw_concl;
val atts = more_atts @ map (prep_att lthy) raw_atts;
val print_cfg = {interactive = int, pos = Position.thread_data (), proof_state= false}
val pos = Position.thread_data ();
val print_results =
Proof_Display.print_results {interactive = int, pos = pos, proof_state = false};
fun after_qed' results goal_ctxt' =
let
@@ -2693,13 +2710,13 @@
Local_Theory.notes_kind kind
(map2 (fn (b, _) => fn ths => (b, [(ths, [])])) stmt results') lthy;
val lthy'' =
if Binding.is_empty_atts (name, atts) then
(Proof_Display.print_results print_cfg lthy' ((kind, ""), res); lthy')
if Binding.is_empty_atts (name, atts)
then (print_results lthy' ((kind, ""), res); lthy')
else
let
val ([(res_name, _)], lthy'') =
Local_Theory.notes_kind kind [((name, atts), [(maps #2 res, [])])] lthy';
val _ = Proof_Display.print_results print_cfg lthy' ((kind, res_name), res);
val _ = print_results lthy' ((kind, res_name), res);
in lthy'' end;
in after_qed results' lthy'' end;
@@ -2900,6 +2917,7 @@ fun compute_trace_ML ctxt oid pos_opt pos' =
val parse_oid = Scan.lift(Parse.position Args.name)
val parse_cid = Scan.lift(Parse.position Args.name)
val parse_oid' = Term_Style.parse -- parse_oid
val parse_oid'' = Scan.lift(Parse.embedded_position)
val parse_cid' = Term_Style.parse -- parse_cid
@@ -2944,13 +2962,6 @@ fun compute_cid_repr ctxt cid _ =
let val _ = DOF_core.get_onto_class_name_global cid (Proof_Context.theory_of ctxt)
in Const(cid,dummyT) end
fun compute_name_repr ctxt oid pos =
let val instances = DOF_core.get_instances ctxt
val markup = DOF_core.get_instance_name_global oid (Proof_Context.theory_of ctxt)
|> Name_Space.markup (Name_Space.space_of_table instances)
val _ = Context_Position.report ctxt pos markup;
in HOLogic.mk_string oid end
local
fun pretty_attr_access_style ctxt (style, ((attr,pos),(oid,pos'))) =
@@ -2960,8 +2971,15 @@ fun pretty_trace_style ctxt (style, (oid,pos)) =
Document_Output.pretty_term ctxt (style (ISA_core.compute_attr_access (Context.Proof ctxt)
traceN oid NONE pos));
fun pretty_name_style ctxt (style, (oid,pos)) =
Document_Output.pretty_term ctxt (style (compute_name_repr ctxt oid pos))
fun pretty_name_style ctxt (oid,pos) =
let
val instances = DOF_core.get_instances ctxt
val ns = Name_Space.space_of_table instances
val name = DOF_core.get_instance_name_global oid (Proof_Context.theory_of ctxt)
val ctxt' = Config.put Name_Space.names_unique true ctxt
val _ = name |> Name_Space.markup ns
|> Context_Position.report ctxt pos
in Name_Space.pretty ctxt' ns name end
fun pretty_cid_style ctxt (style, (cid,pos)) =
Document_Output.pretty_term ctxt (style (compute_cid_repr ctxt cid pos));
@@ -2974,6 +2992,7 @@ fun context_position_parser parse_con (ctxt, toks) =
val (res, (ctxt', toks')) = parse_con (ctxt, toks)
in ((res,pos),(ctxt', toks')) end
val parse_cid0 = parse_cid
val parse_cid = (context_position_parser Args.typ_abbrev)
>> (fn (Type(ss,_),pos) => (pos,ss)
|( _,pos) => ISA_core.err "Undefined Class Id" pos);
@@ -2985,6 +3004,11 @@ fun pretty_cid_style ctxt (_,(pos,cid)) =
(*reconversion to term in order to have access to term print options like: names_short etc...) *)
Document_Output.pretty_term ctxt ((compute_cid_repr ctxt cid pos));
fun get_class_2_ML ctxt (str,_) =
let val thy = Context.theory_of ctxt
val DOF_core.Onto_Class S = (DOF_core.get_onto_class_global' str thy)
in @{make_string} S end
in
val _ = Theory.setup
(ML_Antiquotation.inline \<^binding>\<open>docitem\<close>
@@ -2995,11 +3019,13 @@ val _ = Theory.setup
(fn (ctxt,toks) => (parse_oid >> trace_attr_2_ML ctxt) (ctxt, toks)) #>
ML_Antiquotation.inline \<^binding>\<open>docitem_name\<close>
(fn (ctxt,toks) => (parse_oid >> get_instance_name_2_ML ctxt) (ctxt, toks)) #>
ML_Antiquotation.inline \<^binding>\<open>doc_class\<close>
(fn (ctxt,toks) => (parse_cid0 >> get_class_2_ML ctxt) (ctxt, toks)) #>
basic_entity \<^binding>\<open>trace_attribute\<close> parse_oid' pretty_trace_style #>
basic_entity \<^binding>\<open>doc_class\<close> parse_cid' pretty_cid_style #>
basic_entity \<^binding>\<open>onto_class\<close> parse_cid' pretty_cid_style #>
basic_entity \<^binding>\<open>docitem_attribute\<close> parse_attribute_access' pretty_attr_access_style #>
basic_entity \<^binding>\<open>docitem_name\<close> parse_oid' pretty_name_style
basic_entity \<^binding>\<open>docitem_name\<close> parse_oid'' pretty_name_style
)
end
end
@@ -3063,6 +3089,18 @@ fun define_inv (bind, inv) thy =
|> Syntax.check_term (Proof_Context.init_global thy)
in thy |> Named_Target.theory_map (define_cond bind eq) end
fun define_monitor_const doc_class_name bind thy =
let val bname = Long_Name.base_name doc_class_name
val rex = DOF_core.rex_of doc_class_name thy
val monitor_binding = bind |> (Binding.qualify false bname
#> Binding.suffix_name monitor_suffixN)
in
if can hd rex
then
let val eq = Logic.mk_equals (Free(Binding.name_of monitor_binding, doc_class_rexp_T), hd rex)
in thy |> Named_Target.theory_map (define_cond monitor_binding eq) end
else thy
end
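(* illustrative note: a class declared with an "accepts" clause, e.g. "doc_class article = ... accepts "(...)"",
   now also obtains a constant named "article_monitor" denoting its accepts-regexp,
   as used by the article_monitor/report_monitor proofs elsewhere in this changeset *)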
fun add_doc_class_cmd overloaded (raw_params, binding)
raw_parent raw_fieldsNdefaults reject_Atoms regexps invariants thy =
@@ -3133,9 +3171,10 @@ fun add_doc_class_cmd overloaded (raw_params, binding)
| SOME _ => if (not o null) record_fields
then add record_fields invariants' {virtual=false}
else add [DOF_core.tag_attr] invariants' {virtual=true})
|> (fn thy => OntoLinkParser.docitem_antiquotation binding (cid thy) thy)
(* defines the ontology-checked text antiquotation to this document class *)
|> (fn thy => OntoLinkParser.docitem_antiquotation binding (cid thy) thy)
|> (fn thy => fold define_inv (invariants') thy)
|> (fn thy => define_monitor_const (cid thy) binding thy)
(* The function declare_ISA_class_accessor_and_check_instance uses a prefix
because the class name is already bound to "doc_class Regular_Exp.rexp" constant
by add_doc_class_cmd function *)
@@ -3175,6 +3214,76 @@
val clean_mixfix_sub = translate_string
(fn "\<^sub>_" => "_"
| "\<^sub>'" => "'"
| c => c);
val prefix_sub = prefix "\<^sub>"
val convertN = "convert"
fun add_onto_morphism classes_mappings eqs thy =
if (length classes_mappings = length eqs) then
let
val specs = map (fn x => (Binding.empty_atts, x)) eqs
val converts =
map (fn (oclasses, dclass) =>
let
val oclasses_string = map YXML.content_of oclasses
val dclass_string = YXML.content_of dclass
val const_sub_name = dclass_string
|> (oclasses_string |> fold_rev (fn x => fn y => x ^ "_" ^ y))
|> String.explode |> map (fn x => "\<^sub>" ^ (String.str x)) |> String.concat
val convert_typ = oclasses_string |> rev |> hd
|> (oclasses_string |> rev |> tl |> fold (fn x => fn y => x ^ " \<times> " ^ y))
val convert_typ' = convert_typ ^ " \<Rightarrow> " ^ dclass_string
val oclasses_sub_string = oclasses_string
|> map (clean_mixfix_sub
#> String.explode
#> map (prefix_sub o String.str)
#> String.concat)
val mixfix = oclasses_sub_string |> rev |> hd
|> (oclasses_sub_string |> rev |> tl |> fold (fn x => fn y => x ^ "\<^sub>\<times>" ^ y))
|> ISA_core.clean_mixfix
val mixfix' = convertN ^ mixfix ^ "\<^sub>\<Rightarrow>"
^ (dclass_string |> clean_mixfix_sub |> String.explode
|> map (prefix_sub o String.str) |> String.concat)
in SOME (Binding.name (convertN ^ const_sub_name), SOME convert_typ', Mixfix.mixfix mixfix') end)
classes_mappings
val args = map (fn (x, y) => (x, y, [], [])) (converts ~~ specs)
val lthy = Named_Target.theory_init thy
val updated_lthy = fold (fn (decl, spec, prems, params) => fn lthy =>
let
val (_, lthy') = Specification.definition_cmd decl params prems spec true lthy
in lthy' end) args lthy
in Local_Theory.exit_global updated_lthy end
(* alternative way to update the theory using the Theory.join_theory function *)
(*val lthys = map (fn (decl, spec, prems, params) =>
let
val (_, lthy') = Specification.definition_cmd decl params prems spec true lthy
in lthy' end) args
val thys = map (Local_Theory.exit_global) lthys
in Theory.join_theory thys end*)
else error("The number of morphisms declarations does not match the number of definitions")
fun add_onto_morphism' (classes_mappings, eqs) = add_onto_morphism classes_mappings eqs
val parse_onto_morphism = Parse.and_list
((Parse.$$$ "(" |-- Parse.enum1 "," Parse.typ --| Parse.$$$ ")" --| \<^keyword>\<open>to\<close>)
-- Parse.typ)
-- (\<^keyword>\<open>where\<close> |-- Parse.and_list Parse.prop)
(* The name of the definitions must follow this rule:
for the declaration "onto_morphism (AA, BB) to CC",
the name of the constant must be "convert\<^sub>A\<^sub>A\<^sub>\<times>\<^sub>B\<^sub>B\<^sub>\<Rightarrow>\<^sub>C\<^sub>C". *)
val _ =
Outer_Syntax.command \<^command_keyword>\<open>onto_morphism\<close> "define ontology morphism"
(parse_onto_morphism >> (Toplevel.theory o add_onto_morphism'));
end (* struct *)
\<close>
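For orientation, the naming rule documented above suggests the following minimal usage sketch of the new onto_morphism command (the classes AA, BB, CC are hypothetical and not part of this changeset; the defining right-hand side is deliberately left abstract):
doc_class AA = aa :: "int"
doc_class BB = bb :: "int"
doc_class CC = cc :: "int"
onto_morphism (AA, BB) to CC
  where "convert\<^sub>A\<^sub>A\<^sub>\<times>\<^sub>B\<^sub>B\<^sub>\<Rightarrow>\<^sub>C\<^sub>C \<sigma> = (undefined :: CC)"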

View File

@@ -236,6 +236,72 @@ end; (* local *)
end (* struct *)
\<close>
lemma regexp_sub : "a \<le> b \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b (\<lfloor>a\<rfloor>) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (\<lfloor>b\<rfloor>)"
using dual_order.trans by auto
lemma regexp_seq_mono:
"Lang(a) \<subseteq> Lang (a') \<Longrightarrow> Lang(b) \<subseteq> Lang (b') \<Longrightarrow> Lang(a ~~ b) \<subseteq> Lang(a' ~~ b')" by auto
lemma regexp_seq_mono':
"L\<^sub>s\<^sub>u\<^sub>b(a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (a') \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b(b) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (b') \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b(a ~~ b) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b(a' ~~ b')" by auto
lemma regexp_alt_mono :"Lang(a) \<subseteq> Lang (a') \<Longrightarrow> Lang(a || b) \<subseteq> Lang(a' || b)" by auto
lemma regexp_alt_mono' :"L\<^sub>s\<^sub>u\<^sub>b(a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (a') \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b(a || b) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b(a' || b)" by auto
lemma regexp_alt_commute : "Lang(a || b) = Lang(b || a)" by auto
lemma regexp_alt_commute' : "L\<^sub>s\<^sub>u\<^sub>b(a || b) = L\<^sub>s\<^sub>u\<^sub>b(b || a)" by auto
lemma regexp_unit_right : "Lang (a) = Lang (a ~~ One) " by simp
lemma regexp_unit_right' : "L\<^sub>s\<^sub>u\<^sub>b (a) = L\<^sub>s\<^sub>u\<^sub>b (a ~~ One) " by simp
lemma regexp_unit_left : "Lang (a) = Lang (One ~~ a) " by simp
lemma regexp_unit_left' : "L\<^sub>s\<^sub>u\<^sub>b (a) = L\<^sub>s\<^sub>u\<^sub>b (One ~~ a) " by simp
lemma opt_star_incl :"Lang (opt a) \<subseteq> Lang (Star a)" by (simp add: opt_def subset_iff)
lemma opt_star_incl':"L\<^sub>s\<^sub>u\<^sub>b (opt a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (Star a)" by (simp add: opt_def subset_iff)
lemma rep1_star_incl:"Lang (rep1 a) \<subseteq> Lang (Star a)"
unfolding rep1_def by(subst L_Star, subst L_Conc)(force)
lemma rep1_star_incl':"L\<^sub>s\<^sub>u\<^sub>b (rep1 a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (Star a)"
unfolding rep1_def by(subst L\<^sub>s\<^sub>u\<^sub>b_Star, subst L\<^sub>s\<^sub>u\<^sub>b_Conc)(force)
lemma cancel_rep1 : "Lang (a) \<subseteq> Lang (rep1 a)"
unfolding rep1_def by auto
lemma cancel_rep1' : "L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (rep1 a)"
unfolding rep1_def by auto
lemma seq_cancel_opt : "Lang (a) \<subseteq> Lang (c) \<Longrightarrow> Lang (a) \<subseteq> Lang (opt b ~~ c)"
by(subst regexp_unit_left, rule regexp_seq_mono)(simp_all add: opt_def)
lemma seq_cancel_opt' : "L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (c) \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (opt b ~~ c)"
by(subst regexp_unit_left', rule regexp_seq_mono')(simp_all add: opt_def)
lemma seq_cancel_Star : "Lang (a) \<subseteq> Lang (c) \<Longrightarrow> Lang (a) \<subseteq> Lang (Star b ~~ c)"
by auto
lemma seq_cancel_Star' : "L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (c) \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (Star b ~~ c)"
by auto
lemma mono_Star : "Lang (a) \<subseteq> Lang (b) \<Longrightarrow> Lang (Star a) \<subseteq> Lang (Star b)"
by(auto)(metis in_star_iff_concat order.trans)
lemma mono_Star' : "L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (b) \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b (Star a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (Star b)"
by(auto)(metis in_star_iff_concat order.trans)
lemma mono_rep1_star:"Lang (a) \<subseteq> Lang (b) \<Longrightarrow> Lang (rep1 a) \<subseteq> Lang (Star b)"
using mono_Star rep1_star_incl by blast
lemma mono_rep1_star':"L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (b) \<Longrightarrow> L\<^sub>s\<^sub>u\<^sub>b (rep1 a) \<subseteq> L\<^sub>s\<^sub>u\<^sub>b (Star b)"
using mono_Star' rep1_star_incl' by blast
no_notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
no_notation Plus (infixr "||" 55)
no_notation Times (infixr "~~" 60)

View File

@@ -13,7 +13,7 @@
(*<*)
theory "Isabelle_DOF_Manual"
imports "M_05_Implementation"
imports "M_07_Implementation"
begin
close_monitor*[this]
check_doc_global

View File

@@ -124,7 +124,7 @@ abstract*[abs, keywordlist="[\<open>Ontology\<close>, \<open>Ontological Modelin
A major application of \<^dof> is the integrated development of
formal certification documents (\<^eg>, for Common Criteria or CENELEC
50128) that require consistency across both formal and informal
arguments.
arguments.
\<^isadof> is integrated into Isabelle's IDE, which
allows for smooth ontology development as well as immediate

View File

@@ -24,17 +24,17 @@ The linking of the \<^emph>\<open>formal\<close> to the \<^emph>\<open>informal\
digitization of knowledge and its propagation. This challenge incites numerous research efforts
summarized under the labels ``semantic web,'' ``data mining,'' or any form of advanced ``semantic''
text processing. A key role in structuring this linking is played by \<^emph>\<open>document ontologies\<close> (also called
\<^emph>\<open>vocabulary\<close> in the semantic web community~@{cite "w3c:ontologies:2015"}), \<^ie>, a machine-readable
\<^emph>\<open>vocabulary\<close> in the semantic web community~\<^cite>\<open>"w3c:ontologies:2015"\<close>), \<^ie>, a machine-readable
form of the structure of documents as well as the document discourse.
Such ontologies can be used for the scientific discourse within scholarly articles, mathematical
libraries, and in the engineering discourse of standardized software certification
documents~@{cite "boulanger:cenelec-50128:2015" and "cc:cc-part3:2006"}. All these
documents~\<^cite>\<open>"boulanger:cenelec-50128:2015" and "cc:cc-part3:2006"\<close>. All these
documents contain formal content and have to follow a given structure. In practice, large groups of developers have to produce a substantial
set of documents where consistency is notoriously difficult to maintain. In particular,
set of documents where consistency is notoriously difficult to maintain. In particular,
certifications are centred around the \<^emph>\<open>traceability\<close> of requirements throughout the entire
set of documents. While technical solutions for the traceability problem exist (most notably:
DOORS~@{cite "ibm:doors:2019"}), they are weak in the treatment of formal entities (such as formulas
DOORS~\<^cite>\<open>"ibm:doors:2019"\<close>), they are weak in the treatment of formal entities (such as formulas
and their logical contexts).
Further applications are the domain-specific discourse in juridical texts or medical reports.
@@ -53,7 +53,7 @@ the Document Ontology Framework (\<^dof>) and an implementation of \<^dof> calle
provers. \<^isadof> is an instance of this novel framework, implemented as an extension of Isabelle/HOL,
to \<^emph>\<open>model\<close> typed ontologies and to \<^emph>\<open>enforce\<close> them during document evolution. Based on Isabelle's
infrastructures, ontologies may refer to types, terms, proven theorems, code, or established
assertions. Based on a novel adaption of the Isabelle IDE (called PIDE, @{cite "wenzel:asynchronous:2014"}),
assertions. Based on a novel adaption of the Isabelle IDE (called PIDE, \<^cite>\<open>"wenzel:asynchronous:2014"\<close>),
a document is checked to be \<^emph>\<open>conform\<close> to a particular ontology---\<^isadof> is designed to give fast
user-feedback \<^emph>\<open>during the capture of content\<close>. This is particularly valuable in the case of document
evolution, where the \<^emph>\<open>coherence\<close> between the formal and the informal parts of the content can
@@ -63,9 +63,9 @@ To avoid any misunderstanding: \<^isadof> is \<^emph>\<open>not a theory in HOL
track and trace links in texts. It is an \<^emph>\<open>environment to write structured text\<close> which
\<^emph>\<open>may contain\<close> Isabelle/HOL definitions and proofs like mathematical articles, tech-reports and
scientific papers---as the present one, which is written in \<^isadof> itself. \<^isadof> is a plugin
into the Isabelle/Isar framework in the style of~@{cite "wenzel.ea:building:2007"}. However,
into the Isabelle/Isar framework in the style of~\<^cite>\<open>"wenzel.ea:building:2007"\<close>. However,
\<^isadof> will generate from ontologies a theory infrastructure consisting of types, terms, theorems
and code that allows both interactive checking as well as formal reasoning over meta-data
and code that allows both interactive checking and formal reasoning over meta-data
related to annotated documents.\<close>
subsubsection\<open>How to Read This Manual\<close>
@@ -113,7 +113,7 @@ CHANGELOG.md CITATION examples install LICENSE README.md ROOTS src\<close
subsubsection\<open>How to Cite \<^isadof>\<close>
text\<open>
If you use or extend \<^isadof> in your publications, please use
\<^item> for the \<^isadof> system~@{cite "brucker.ea:isabelle-ontologies:2018"}:
\<^item> for the \<^isadof> system~\<^cite>\<open>"brucker.ea:isabelle-ontologies:2018"\<close>:
\begin{quote}\small
A.~D. Brucker, I.~Ait-Sadoune, N. Méric, and B.~Wolff. Using Deep Ontologies in Formal
Software Engineering. In \<^emph>\<open>International Conference on Rigorous State-Based Methods (ABZ 2023)\<close>,
@@ -122,7 +122,7 @@ text\<open>
\end{quote}
A \<^BibTeX>-entry is available at:
\<^url>\<open>https://www.lri.fr/~wolff/bibtex/wolff.html\<close>.
\<^item> an older description of the system~@{cite "brucker.ea:isabelle-ontologies:2018"}:
\<^item> an older description of the system~\<^cite>\<open>"brucker.ea:isabelle-ontologies:2018"\<close>:
\begin{quote}\small
A.~D. Brucker, I.~Ait-Sadoune, P.~Crisafulli, and B.~Wolff. Using the {Isabelle} ontology
framework: Linking the formal with the informal. In \<^emph>\<open>Conference on Intelligent Computer
@@ -132,7 +132,7 @@ text\<open>
\end{quote}
A \<^BibTeX>-entry is available at:
\<^url>\<open>https://www.brucker.ch/bibliography/abstract/brucker.ea-isabelle-ontologies-2018\<close>.
\<^item> for the implementation of \<^isadof>~@{cite "brucker.ea:isabelledof:2019"}:
\<^item> for the implementation of \<^isadof>~\<^cite>\<open>"brucker.ea:isabelledof:2019"\<close>:
\begin{quote}\small
A.~D. Brucker and B.~Wolff. \<^isadof>: Design and implementation. In P.C.~{\"O}lveczky and
G.~Sala{\"u}n, editors, \<^emph>\<open>Software Engineering and Formal Methods (SEFM)\<close>, number 11724 in

View File

@@ -27,17 +27,17 @@ figure*[architecture::figure,relative_width="95",file_src="''figures/isabelle-ar
text*[bg::introduction]\<open>
While Isabelle is widely perceived as an interactive theorem
prover for HOL (Higher-order Logic)~@{cite "nipkow.ea:isabelle:2002"}, we would like to emphasize
prover for HOL (Higher-order Logic)~\<^cite>\<open>"nipkow.ea:isabelle:2002"\<close>, we would like to emphasize
the view that Isabelle is far more than that: it is the \<^emph>\<open>Eclipse of Formal Methods Tools\<close>. This
refers to the ``\<^emph>\<open>generic system framework of Isabelle/Isar underlying recent versions of Isabelle.
Among other things, Isabelle provides an infrastructure for Isabelle plug-ins, comprising extensible
state components and extensible syntax that can be bound to SML programs. Thus, the Isabelle
architecture may be understood as an extension and refinement of the traditional `LCF approach',
with explicit infrastructure for building derivative systems.\<close>''~@{cite "wenzel.ea:building:2007"}
with explicit infrastructure for building derivative systems.\<close>''~\<^cite>\<open>"wenzel.ea:building:2007"\<close>
The current system framework offers moreover the following features:
\<^item> a build management grouping components into pre-compiled sessions,
\<^item> a prover IDE (PIDE) framework~@{cite "wenzel:asynchronous:2014"} with various front-ends,
\<^item> a prover IDE (PIDE) framework~\<^cite>\<open>"wenzel:asynchronous:2014"\<close> with various front-ends,
\<^item> documentation-generation,
\<^item> code generators for various target languages,
\<^item> an extensible front-end language Isabelle/Isar, and,
@@ -118,9 +118,9 @@ text\<open> A simple text-element \<^index>\<open>text-element\<close> may look
@{boxed_theory_text [display]\<open>
text\<open> This is a simple text.\<close>\<close>}
... so it is a command \<^theory_text>\<open>text\<close> followed by an argument (here in \<open>\<open> ... \<close>\<close> parenthesis) which
contains characters. While \<^theory_text>\<open>text\<close>- elements play a major role in this manual --- document
generation is the main use-case of \<^dof> in its current stage --- it is important to note that there
\ldots so it is a command \<^theory_text>\<open>text\<close> followed by an argument (here in \<open>\<open> ... \<close>\<close> parenthesis) which
contains characters. While \<^theory_text>\<open>text\<close>-elements play a major role in this manual---document
generation is the main use-case of \<^dof> in its current stage---it is important to note that there
are actually three families of ``ontology aware'' document elements with analogous
syntax to standard ones. The difference is a bracket with meta-data of the form:
@{theory_text [display,indent=5, margin=70]
@@ -130,7 +130,7 @@ ML*[label::classid, attr\<^sub>1=E\<^sub>1, ... attr\<^sub>n=E\<^sub>n]\<open> s
value*[label::classid, attr\<^sub>1=E\<^sub>1, ... attr\<^sub>n=E\<^sub>n]\<open> some annotated \<lambda>-term \<close>
\<close>}
Other instances of document elements belonging to these families are, for example, the freeform
Other instances of document elements belonging to these families are, for example, the free-form
\<^theory_text>\<open>Definition*\<close> and \<^theory_text>\<open>Lemma*\<close> as well as their formal counterparts \<^theory_text>\<open>definition*\<close> and \<^theory_text>\<open>lemma*\<close>,
which allow in addition to their standard Isabelle functionality the creation and management of
ontology-generated meta-data associated to them (cf. -@{text_section (unchecked) "writing_doc"}).
@@ -151,7 +151,7 @@ to switch from a text-context to a term-context, for example. Therefore, antiquo
literature, this concept has been referred to \<open>Cascade-Syntax\<close> and was used in the
Centaur-system and is existing in some limited form in some Emacs-implementations these days. \<close>
Isabelle comes with a number of built-in antiquotations for text- and code-contexts;
a detailed overview can be found in @{cite "wenzel:isabelle-isar:2020"}. \<^dof> reuses this general
a detailed overview can be found in \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>. \<^dof> reuses this general
infrastructure but \<^emph>\<open>generates\<close> its own families of antiquotations from ontologies.\<close>
text\<open> An example for a text-element \<^index>\<open>text-element\<close> using built-in antoquotations
@@ -203,28 +203,28 @@ text\<open>
(protected by a kernel). This includes, \<^eg>, ProofPower, HOL4, HOL-light, Isabelle, or Coq
and its derivatives. \<^dof> is, however, designed for fast interaction in an IDE. If a user wants
to benefit from this experience, only Isabelle and Coq have the necessary infrastructure of
asynchronous proof-processing and support by a PIDE~@{cite "wenzel:asynchronous:2014" and
"wenzel:system:2014" and "barras.ea:pervasive:2013" and "faithfull.ea:coqoon:2018"} which
asynchronous proof-processing and support by a PIDE~\<^cite>\<open>"wenzel:asynchronous:2014" and
"wenzel:system:2014" and "barras.ea:pervasive:2013" and "faithfull.ea:coqoon:2018"\<close> which
in many features over-accomplishes the required features of \<^dof>.
\<close>
figure*["fig_dof_ide",relative_width="95",file_src="''figures/cicm2018-combined.png''"]\<open>
The \<^isadof> IDE (left) and the corresponding PDF (right), showing the first page
of~@{cite "brucker.ea:isabelle-ontologies:2018"}.\<close>
of~\<^cite>\<open>"brucker.ea:isabelle-ontologies:2018"\<close>.\<close>
text\<open>
We call the present implementation of \<^dof> on the Isabelle platform \<^isadof> .
@{figure "fig_dof_ide"} shows a screen-shot of an introductory paper on
\<^isadof>~@{cite "brucker.ea:isabelle-ontologies:2018"}: the \<^isadof> PIDE can be seen on the left,
@{figure "fig_dof_ide"} shows a screenshot of an introductory paper on
\<^isadof>~\<^cite>\<open>"brucker.ea:isabelle-ontologies:2018"\<close>: the \<^isadof> PIDE can be seen on the left,
while the generated presentation in PDF is shown on the right.
Isabelle provides, beyond the features required for \<^dof>, a lot of additional benefits.
Besides UTF8-support for characters used in text-elements, Isabelle offers built-in already a
mechanism for user-programmable antiquotations \<^index>\<open>antiquotations\<close> which we use to implement
semantic macros \<^index>\<open>semantic macros\<close> in \<^isadof> (We will actually use these two terms
semantic macros \<^index>\<open>semantic macros\<close> in \<^isadof> (We will actually use these two terms
as synonym in the context of \<^isadof>). Moreover, \<^isadof> allows for the asynchronous
evaluation and checking of the document content~@{cite "wenzel:asynchronous:2014" and
"wenzel:system:2014" and "barras.ea:pervasive:2013"} and is dynamically extensible. Its PIDE
evaluation and checking of the document content~\<^cite>\<open>"wenzel:asynchronous:2014" and
"wenzel:system:2014" and "barras.ea:pervasive:2013"\<close> and is dynamically extensible. Its PIDE
provides a \<^emph>\<open>continuous build, continuous check\<close> functionality, syntax highlighting, and
auto-completion. It also provides infrastructure for displaying meta-information (\<^eg>, binding
and type annotation) as pop-ups, while hovering over sub-expressions. A fine-grained dependency

View File

@@ -74,15 +74,15 @@ is currently consisting out of three AFP entries:
contains the \<^isadof> system itself, including the \<^isadof> manual.
\<^item> \<^verbatim>\<open>Isabelle_DOF-Example-I\<close>: This entry contains an example of
an academic paper written using the \<^isadof> system oriented towards an
introductory paper. The text is based on~@{cite "brucker.ea:isabelle-ontologies:2018"};
introductory paper. The text is based on~\<^cite>\<open>"brucker.ea:isabelle-ontologies:2018"\<close>;
in the document, we deliberately refrain from integrating references to formal content in order
to demonstrate that \<^isadof> can be used for writing documents with very little direct use of
\<^LaTeX>.
\<^item> \<^verbatim>\<open>Isabelle_DOF-Example-II\<close>: This entry contains another example of
a mathematics-oriented academic paper. It is based on~@{cite "taha.ea:philosophers:2020"}.
It represents a typical mathematical text, heavy in definitions with complex mathematical notation
and a lot of non-trivial cross-referencing between statements, definitions and proofs which
are ontologically tracked. However, wrt. the possible linking between the underlying formal theory
a mathematics-oriented academic paper. It is based on~\<^cite>\<open>"taha.ea:philosophers:2020"\<close>.
It represents a typical mathematical text, heavy in definitions with complex mathematical notation
and a lot of non-trivial cross-referencing between statements, definitions, and proofs which
are ontologically tracked. However, with respect to the possible linking between the underlying formal theory
and this mathematical presentation, it follows a pragmatic path without any ``deep'' linking to
types, terms and theorems, and therefore does deliberately not exploit \<^isadof> 's full potential.\<close>
@@ -177,7 +177,7 @@ There are actually different name categories that form a proper name space, \<^e
constant symbols and type symbols are distinguished.
Additionally, \<^isadof> objects also come with a proper name space: classes (and monitors), instances,
low-level class invariants (SML-defined invariants) all follow the lexical conventions of
Isabelle. For instance, a class can be referenced outside of its theory using
Isabelle. For instance, a class can be referenced outside its theory using
its short-name or its long-name if another class with the same name is defined
in the current theory.
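For instance, consider the following hedged sketch (not taken from the sources), assuming an
instance \<open>bu\<close> of the class \<open>author\<close> from the \<^theory>\<open>Isabelle_DOF.scholarly_paper\<close> ontology is in
scope; both forms reference the same instance, the qualified one being required only in case of
ambiguity:
@{boxed_theory_text [display]
\<open>text\<open>... @{author \<open>bu\<close>} ... and ... @{scholarly_paper.author \<open>bu\<close>} ...\<close>\<close>}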
Isabelle already identifies names with the shortest suffix that is unique in the global
@ -188,7 +188,7 @@ be printed with different prefixes according to their uniqueness.
subsection*[cartouches::example]\<open>Caveat: Lexical Conventions of Cartouches, Strings, Names, ... \<close>
text\<open>WARNING: The embedding of strings, terms, names \<^etc>, as parts of commands, anti-quotations,
terms, \<^etc>, is unfortunately not always as consistent as one might expect when it comes
to variants that should be lexically equivalent in principle. This can be a nuisance for
users, but it is again a consequence of the fact that we build on existing technology that has
been developed over decades.
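For illustration, a hedged sketch (not taken from the sources): the following variants of embedding
the theorem name \<open>refl\<close> should in principle be interchangeable, but whether each of them is
accepted can depend on the concrete command or antiquotation:
@{boxed_theory_text [display]
\<open>text\<open>@{thm refl} and @{thm "refl"} and @{thm \<open>refl\<close>}\<close>\<close>}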
@ -289,7 +289,7 @@ contained in the theory \<^theory>\<open>Isabelle_DOF.scholarly_paper\<close>. \
subsection\<open>Writing Academic Publications: A Freeform Mathematics Text \<close>
text*[csp_paper_synthesis::technical, main_author = "Some bu"]\<open>We present a typical mathematical
paper focusing on its form, not referring in any sense to its content which is out of scope here.
As mentioned before, we chose the paper~@{cite "taha.ea:philosophers:2020"} for this purpose,
As mentioned before, we chose the paper~\<^cite>\<open>"taha.ea:philosophers:2020"\<close> for this purpose,
which is written in the so-called free-form style: formulas are superficially parsed and
type-set, but no deeper type-checking and validation against the underlying logical context
is undertaken. \<close>
@ -629,7 +629,7 @@ text\<open> This is *\<open>emphasized\<close> and this is a
citation @{cite "brucker.ea:isabelle-ontologies:2018"}.\<close>
\<close>}
The list of standard Isabelle document antiquotations, as well as their options and styles,
can be found in the Isabelle reference manual @{cite "wenzel:isabelle-isar:2020"},
can be found in the Isabelle reference manual \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>,
section 4.2.
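For illustration, a hedged sketch of typical uses (the concrete theorem, term, and type chosen here
are arbitrary):
@{boxed_theory_text [display]
\<open>text\<open>The theorem @{thm refl}, the term @{term "rev [a, b]"}, and
the type @{typ "'a list"} are checked against the logical context.\<close>\<close>}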
In practice, \<^isadof> documents with ambitious layout will contain a certain number of
@ -638,7 +638,7 @@ not possible. As far as possible, raw \<^LaTeX> should be restricted to the defi
of ontologies and macros (see @{docitem (unchecked) \<open>isadof_ontologies\<close>}). If raw
\<^LaTeX> commands can not be avoided, it is recommended to use the Isabelle document comment
\<^latex>\<open>\verb+\+\verb+<^latex>+\<close>\<open>\<open>argument\<close>\<close> to isolate these parts
(cf. @{cite "wenzel:isabelle-isar:2020"}).
(cf. \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>).
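For example, a hedged sketch of such an isolation (here \<open>\pagebreak\<close> stands for an arbitrary raw
command):
@{boxed_theory_text [display]
\<open>text\<open>Some explanatory text. \<^latex>\<open>\pagebreak\<close> More text after the isolated command.\<close>\<close>}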
Restricting the use of \<^LaTeX> has two advantages: first, \<^LaTeX> commands can circumvent the
consistency checks of \<^isadof> and, hence, only if no \<^LaTeX> commands are used, \<^isadof> can

View File

@ -0,0 +1,227 @@
(*************************************************************************
* Copyright (C)
* 2019-2022 University of Exeter
* 2018-2022 University of Paris-Saclay
* 2018 The University of Sheffield
*
* License:
* This program can be redistributed and/or modified under the terms
* of the 2-clause BSD-style license.
*
* SPDX-License-Identifier: BSD-2-Clause
*************************************************************************)
(*<*)
theory
"M_04_Document_Ontology"
imports
"M_03_GuidedTour"
keywords "class_synonym" :: thy_defn
begin
(*>*)
(*<*)
definition combinator1 :: "'a \<Rightarrow> ('a \<Rightarrow> 'b) \<Rightarrow> 'b" (infixl "|>" 65)
where "x |> f = f x"
ML\<open>
local
val _ =
Outer_Syntax.local_theory \<^command_keyword>\<open>class_synonym\<close> "declare type abbreviation"
(Parse.type_args -- Parse.binding --
(\<^keyword>\<open>=\<close> |-- Parse.!!! (Parse.typ -- Parse.opt_mixfix'))
>> (fn ((args, a), (rhs, mx)) => snd o Typedecl.abbrev_cmd (a, args, mx) rhs));
in end
\<close>
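(* Hedged usage sketch (the command is not exercised in this theory): class_synonym behaves like
   a type abbreviation, e.g.  class_synonym 'a pair_of = "'a \<times> 'a"  would introduce
   \<open>'a pair_of\<close> as an abbreviation for \<open>'a \<times> 'a\<close>. *)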
(*>*)
(*<*)
doc_class "title" = short_title :: "string option" <= "None"
doc_class elsevier =
organization :: string
address_line :: string
postcode :: int
city :: string
(*doc_class elsevier_affiliation = affiliation +*)
doc_class acm =
position :: string
institution :: string
department :: int
street_address :: string
city :: string
state :: int
country :: string
postcode :: int
(*doc_class acm_affiliation = affiliation +*)
doc_class lncs =
institution :: string
(*doc_class lncs_affiliation = affiliation +*)
doc_class author =
name :: string
email :: "string" <= "''''"
invariant ne_name :: "name \<sigma> \<noteq> ''''"
doc_class elsevier_author = "author" +
affiliations :: "elsevier list"
short_author :: string
url :: string
footnote :: string
text*[el1:: "elsevier"]\<open>\<close>
(*text*[el_aff1:: "affiliation", journal_style = "@{elsevier \<open>el1\<close>}"]\<open>\<close>*)
term*\<open>@{elsevier \<open>el1\<close>}\<close>
text*[el_auth1:: "elsevier_author", affiliations = "[@{elsevier \<open>el1\<close>}]"]\<open>\<close>
doc_class acm_author = "author" +
affiliations :: "acm list"
orcid :: int
footnote :: string
text*[acm1:: "acm"]\<open>\<close>
(*text*[acm_aff1:: "acm affiliation", journal_style = "@{acm \<open>acm1\<close>}"]\<open>\<close>*)
text*[acm_auth1:: "acm_author", affiliations = "[@{acm \<open>acm1\<close>}]"]\<open>\<close>
doc_class lncs_author = "author" +
affiliations :: "lncs list"
orcid :: int
short_author :: string
footnote :: string
text*[lncs1:: "lncs"]\<open>\<close>
(*text*[lncs_aff1:: "lncs affiliation", journal_style = "@{lncs \<open>lncs1\<close>}"]\<open>\<close>*)
text*[lncs_auth1:: "lncs_author", affiliations = "[@{lncs \<open>lncs1\<close>}]"]\<open>\<close>
doc_class "text_element" =
authored_by :: "author set" <= "{}"
level :: "int option" <= "None"
invariant authors_req :: "authored_by \<sigma> \<noteq> {}"
and level_req :: "the (level \<sigma>) > 1"
doc_class "introduction" = "text_element" +
authored_by :: "(author) set" <= "UNIV"
doc_class "technical" = "text_element" +
formal_results :: "thm list"
doc_class "definition" = "technical" +
is_formal :: "bool"
doc_class "theorem" = "technical" +
is_formal :: "bool"
assumptions :: "term list" <= "[]"
statement :: "term option" <= "None"
doc_class "conclusion" = "text_element" +
resumee :: "(definition set \<times> theorem set)"
invariant is_form :: "(\<exists>x\<in>(fst (resumee \<sigma>)). definition.is_formal x) \<longrightarrow>
(\<exists>y\<in>(snd (resumee \<sigma>)). is_formal y)"
text*[def::"definition", is_formal = "True"]\<open>\<close>
text*[theo::"theorem", is_formal = "False"]\<open>\<close>
text*[conc::"conclusion", resumee="({@{definition \<open>def\<close>}}, {@{theorem \<open>theo\<close>}})"]\<open>\<close>
value*\<open>resumee @{conclusion \<open>conc\<close>} |> fst\<close>
value*\<open>resumee @{conclusion \<open>conc\<close>} |> snd\<close>
doc_class "article" =
style_id :: string <= "''LNCS''"
accepts "(title ~~ \<lbrace>author\<rbrace>\<^sup>+ ~~ \<lbrace>introduction\<rbrace>\<^sup>+
~~ \<lbrace>\<lbrace>definition ~~ example\<rbrace>\<^sup>+ || theorem\<rbrace>\<^sup>+ ~~ \<lbrace>conclusion\<rbrace>\<^sup>+)"
datatype kind = expert_opinion | argument | "proof"
onto_class result = "technical" +
evidence :: kind
property :: "theorem list" <= "[]"
invariant has_property :: "evidence \<sigma> = proof \<longleftrightarrow> property \<sigma> \<noteq> []"
(*>*)
text*[paper_m::float, main_caption="\<open>A Basic Document Ontology: paper$^m$\<close>"]\<open>
@{boxed_theory_text [display,indent=5]
\<open>doc_class "title" = short_title :: "string option" <= "None"
doc_class affiliation =
journal_style :: '\<alpha>
doc_class author =
affiliations :: "'\<alpha> affiliation list"
name :: string
email :: "string" <= "''''"
invariant ne_name :: "name \<sigma> \<noteq> ''''"
doc_class "text_element" =
authored_by :: "('\<alpha> author) set" <= "{}"
level :: "int option" <= "None"
invariant authors_req :: "authored_by \<noteq> {}"
and level_req :: "the (level) > 1"
doc_class "introduction" = text_element +
authored_by :: "('\<alpha> author) set" <= "UNIV"
doc_class "technical" = text_element +
formal_results :: "thm list"
doc_class "definition" = technical +
is_formal :: "bool"
doc_class "theorem" = technical +
assumptions :: "term list" <= "[]"
statement :: "term option" <= "None"
doc_class "conclusion" = text_element +
resumee :: "(definition set \<times> theorem set)"
invariant (\<forall>x\<in>fst resumee. is_formal x)
\<longrightarrow> (\<exists>y\<in>snd resumee. is_formal y)
doc_class "article" =
style_id :: string <= "''LNCS''"
accepts "(title ~~ \<lbrace>author\<rbrace>\<^sup>+ ~~ \<lbrace>introduction\<rbrace>\<^sup>+
~~ \<lbrace>\<lbrace>definition ~~ example\<rbrace>\<^sup>+ || theorem\<rbrace>\<^sup>+ ~~ \<lbrace>conclusion\<rbrace>\<^sup>+)"\<close>}
\<close>
(*<*)
datatype role = PM \<comment> \<open>Program Manager\<close>
| RQM \<comment> \<open>Requirements Manager\<close>
| DES \<comment> \<open>Designer\<close>
| IMP \<comment> \<open>Implementer\<close>
| ASR \<comment> \<open>Assessor\<close>
| INT \<comment> \<open>Integrator\<close>
| TST \<comment> \<open>Tester\<close>
| VER \<comment> \<open>Verifier\<close>
| VnV \<comment> \<open>Verification and Validation\<close>
| VAL \<comment> \<open>Validator\<close>
abbreviation developer where "developer == DES"
abbreviation validator where "validator == VAL"
abbreviation verifier where "verifier == VER"
doc_class requirement = Isa_COL.text_element +
long_name :: "string option"
is_concerned :: "role set"
text*[req1::requirement,
is_concerned="{developer, validator}"]
\<open>The operating system shall provide secure
memory separation.\<close>
text\<open>
The recurring issue of the certification
is @{requirement \<open>req1\<close>} ...\<close>
term "\<lparr>long_name = None,is_concerned = {developer,validator}\<rparr>"
(*>*)
(*<*)
end
(*>*)

View File

@ -0,0 +1,454 @@
(*************************************************************************
* Copyright (C)
* 2019-2022 University of Exeter
* 2018-2022 University of Paris-Saclay
* 2018 The University of Sheffield
*
* License:
* This program can be redistributed and/or modified under the terms
* of the 2-clause BSD-style license.
*
* SPDX-License-Identifier: BSD-2-Clause
*************************************************************************)
(*<*)
theory
"M_05_Proofs_Ontologies"
imports
"M_04_Document_Ontology"
begin
(*>*)
chapter*[onto_proofs::technical]\<open>Proofs over Ontologies\<close>
text\<open>It is a distinguishing feature of \<^dof> that it does not directly generate meta-data but rather
a \<^emph>\<open>theory of meta-data\<close> that can be used in HOL-terms on various levels
of the Isabelle system and its document generation technology. Meta-data theories can be
converted into executable code and efficiently used in validations, but they can also be used for
theoretical reasoning over given ontologies. While the full potential of this latter possibility still
needs to be explored, we present in the following sections two applications:
\<^enum> Verified ontological morphisms, also called \<^emph>\<open>ontological mappings\<close> in the literature
\<^cite>\<open>"books/daglib/0032976" and "atl" and "BGPP95"\<close>, \<^ie> proofs of invariant preservation
along translation-functions of all instances of \<^verbatim>\<open>doc_class\<close>-es.
\<^enum> Verified refinement relations between the structural descriptions of theory documents,
\<^ie> proofs of language inclusions of monitors of global ontology classes.
\<close>
section*["morphisms"::scholarly_paper.text_section] \<open>Proving Properties over Ontologies\<close>
subsection\<open>Ontology-Morphisms: a Prototypical Example\<close>
text\<open>We define a small ontology with the following classes:\<close>
doc_class AA = aa :: nat
doc_class BB = bb :: int
doc_class CC = cc :: int
doc_class DD = dd :: int
doc_class EE = ee :: int
doc_class FF = ff :: int
onto_morphism (AA, BB) to CC and (DD, EE) to FF
where "convert\<^sub>A\<^sub>A\<^sub>\<times>\<^sub>B\<^sub>B\<^sub>\<Rightarrow>\<^sub>C\<^sub>C \<sigma> = \<lparr> CC.tag_attribute = 1::int,
CC.cc = int(aa (fst \<sigma>)) + bb (snd \<sigma>)\<rparr>"
and "convert\<^sub>D\<^sub>D\<^sub>\<times>\<^sub>E\<^sub>E\<^sub>\<Rightarrow>\<^sub>F\<^sub>F \<sigma> = \<lparr> FF.tag_attribute = 1::int,
FF.ff = dd (fst \<sigma>) + ee (snd \<sigma>) \<rparr>"
text\<open>Note that the \<^term>\<open>convert\<^sub>A\<^sub>A\<^sub>\<times>\<^sub>B\<^sub>B\<^sub>\<Rightarrow>\<^sub>C\<^sub>C\<close>-morphism involves a data-format conversion, and that the
resulting transformation of @{doc_class AA}-instances and @{doc_class BB}-instances is surjective
but not injective. The \<^term>\<open>CC.tag_attribute\<close> is used to potentially differentiate instances with
equal attribute-content and is irrelevant here.\<close>
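text\<open>As a hedged illustration (not part of the original sources): the generated conversion constant
is an ordinary HOL function and can be used inside terms; the following merely type-checks, with
\<open>A\<close> and \<open>B\<close> as free variables of the types @{doc_class AA} and @{doc_class BB}:\<close>
term\<open>CC.cc (convert\<^sub>A\<^sub>A\<^sub>\<times>\<^sub>B\<^sub>B\<^sub>\<Rightarrow>\<^sub>C\<^sub>C (A, B))\<close>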
(*<*) (* Just a test, irrelevant for the document.*)
doc_class A_A = aa :: nat
doc_class BB' = bb :: int
onto_morphism (A_A, BB', CC, DD, EE) to FF
where "convert\<^sub>A\<^sub>_\<^sub>A\<^sub>\<times>\<^sub>B\<^sub>B\<^sub>'\<^sub>\<times>\<^sub>C\<^sub>C\<^sub>\<times>\<^sub>D\<^sub>D\<^sub>\<times>\<^sub>E\<^sub>E\<^sub>\<Rightarrow>\<^sub>F\<^sub>F \<sigma> = \<lparr> FF.tag_attribute = 1::int,
FF.ff = int(aa (fst \<sigma>)) + bb (fst (snd \<sigma>))\<rparr>"
(*>*)
text\<open>This specification construct introduces the following constants and definitions:
\<^item> @{term [source] \<open>convert\<^sub>A\<^sub>A\<^sub>_\<^sub>B\<^sub>B\<^sub>_\<^sub>C\<^sub>C :: AA \<times> BB \<Rightarrow> CC\<close>}
\<^item> @{term [source] \<open>convert\<^sub>D\<^sub>D\<^sub>_\<^sub>E\<^sub>E\<^sub>_\<^sub>F\<^sub>F :: DD \<times> EE \<Rightarrow> FF\<close>}
% @{term [source] \<open>convert\<^sub>A\<^sub>_\<^sub>A\<^sub>\<times>\<^sub>B\<^sub>B\<^sub>'\<^sub>\<times>\<^sub>C\<^sub>C\<^sub>\<times>\<^sub>D\<^sub>D\<^sub>\<times>\<^sub>E\<^sub>E\<^sub>\<Rightarrow>\<^sub>F\<^sub>F :: A_A \<times> BB' \<times> CC \<times> DD \<times> EE \<Rightarrow> FF\<close>}
and corresponding definitions. \<close>
subsection\<open>Proving the Preservation of Ontological Mappings: A Document-Ontology Morphism\<close>
text\<open>\<^dof> as a system is currently geared particularly towards \<^emph>\<open>document\<close>-ontologies, especially
for documentation generated from Isabelle theories. We have meanwhile used it for the
generation of various conference and journal papers, notably using the
\<^theory>\<open>Isabelle_DOF.scholarly_paper\<close> and \<^theory>\<open>Isabelle_DOF.technical_report\<close>-ontologies,
targeting a (small) variety of \<^LaTeX> style-files. A particular aspect of these ontologies,
especially when targeting journals from publishers such as ACM, Springer or Elsevier, is the
description of publication meta-data. Publishers tend to have their own styles regarding what kind
of meta-data should be associated with a journal publication; thus, the depth and
precise format of affiliations, institutions, their relation to authors, and author descriptions
(with photos or without, hair left-combed or right-combed, \<^etc>) vary.
In the following, we present an attempt at a generalized ontology together with several ontology
mappings to more specialized ones, such as those of concrete journals and/or the
\<^theory>\<open>Isabelle_DOF.scholarly_paper\<close>-ontology which we mostly used for our own publications.
\<close>
doc_class elsevier_org =
organization :: string
address_line :: string
postcode :: int
city :: string
doc_class acm_org =
position :: string
institution :: string
department :: int
street_address :: string
city :: string
state :: int
country :: string
postcode :: int
doc_class lncs_inst =
institution :: string
doc_class author =
name :: string
email :: "string" <= "''''"
invariant ne_fsnames :: "name \<sigma> \<noteq> ''''"
doc_class elsevierAuthor = "author" +
affiliations :: "elsevier_org list"
firstname :: string
surname :: string
short_author :: string
url :: string
footnote :: string
invariant ne_fsnames :: "firstname \<sigma> \<noteq> '''' \<and> surname \<sigma> \<noteq> ''''"
(*<*)
text*[el1:: "elsevier_org"]\<open>An example elsevier-journal affiliation.\<close>
term*\<open>@{elsevier_org \<open>el1\<close>}\<close>
text*[el_auth1:: "elsevierAuthor", affiliations = "[@{elsevier_org \<open>el1\<close>}]"]\<open>\<close>
(*>*)
text\<open>\<close>
doc_class acmAuthor = "author" +
affiliations :: "acm_org list"
firstname :: string
familyname :: string
orcid :: int
footnote :: string
invariant ne_fsnames :: "firstname \<sigma> \<noteq> '''' \<and> familyname \<sigma> \<noteq> ''''"
(*<*)
text*[acm1:: "acm"]\<open>An example acm-style affiliation\<close>
(*>*)
text\<open>\<close>
doc_class lncs_author = "author" +
affiliations :: "lncs list"
orcid :: int
short_author :: string
footnote :: string
(*<*)
text*[lncs1:: "lncs"]\<open>An example lncs-style affiliation\<close>
text*[lncs_auth1:: "lncs_author", affiliations = "[@{lncs \<open>lncs1\<close>}]"]\<open>Another example lncs-style affiliation\<close>
find_theorems elsevier.tag_attribute
(*>*)
text\<open>\<close>
definition acm_name where "acm_name f s = f @ '' '' @ s"
fun concatWith :: "string \<Rightarrow> string list \<Rightarrow> string"
where "concatWith str [] = ''''"
|"concatWith str [a] = a"
|"concatWith str (a#R) = a@str@(concatWith str R)"
lemma concatWith_non_mt : "(S\<noteq>[] \<and> (\<exists> s\<in>set S. s\<noteq>'''')) \<longrightarrow> (concatWith sep S) \<noteq> ''''"
proof(induct S)
case Nil
then show ?case by simp
next
case (Cons a S)
then show ?case apply(auto)[1]
using concatWith.elims apply blast
using concatWith.elims apply blast
using list.set_cases by force
qed
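text\<open>As a hedged side remark (not part of the original sources): since \<^const>\<open>concatWith\<close> is an
executable recursive function, it can also be checked computationally; the following evaluates to
\<open>''Main Street,Exeter''\<close>:\<close>
value\<open>concatWith '','' [''Main Street'', ''Exeter'']\<close>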
onto_morphism (acm) to elsevier
where "convert\<^sub>a\<^sub>c\<^sub>m\<^sub>\<Rightarrow>\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r \<sigma> =
\<lparr>elsevier.tag_attribute = acm.tag_attribute \<sigma>,
organization = acm.institution \<sigma>,
address_line = concatWith '','' [acm.street_address \<sigma>, acm.city \<sigma>],
postcode = acm.postcode \<sigma> ,
city = acm.city \<sigma> \<rparr>"
text\<open>Here is a more basic, but equivalent definition for the other way round:\<close>
definition elsevier_to_acm_morphism :: "elsevier_org \<Rightarrow> acm_org"
("_ \<langle>acm\<rangle>\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r" [1000]999)
where "\<sigma> \<langle>acm\<rangle>\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r = \<lparr> acm_org.tag_attribute = 1::int,
acm_org.position = ''no position'',
acm_org.institution = organization \<sigma>,
acm_org.department = 0,
acm_org.street_address = address_line \<sigma>,
acm_org.city = elsevier_org.city \<sigma>,
acm_org.state = 0,
acm_org.country = ''no country'',
acm_org.postcode = elsevier_org.postcode \<sigma> \<rparr>"
text\<open>The following onto-morphism links \<^typ>\<open>elsevierAuthor\<close>'s and \<^typ>\<open>acmAuthor\<close>. Note that
the conversion implies trivial data-conversions (renaming of attributes in the classes),
string-representation conversions, and conversions of second-staged, referenced instances.\<close>
onto_morphism (elsevierAuthor) to acmAuthor
where "convert\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r\<^sub>A\<^sub>u\<^sub>t\<^sub>h\<^sub>o\<^sub>r\<^sub>\<Rightarrow>\<^sub>a\<^sub>c\<^sub>m\<^sub>A\<^sub>u\<^sub>t\<^sub>h\<^sub>o\<^sub>r \<sigma> =
\<lparr>author.tag_attribute = undefined,
name = concatWith '','' [elsevierAuthor.firstname \<sigma>,elsevierAuthor.surname \<sigma>],
email = author.email \<sigma>,
acmAuthor.affiliations = (elsevierAuthor.affiliations \<sigma>)
|> map (\<lambda>x. x \<langle>acm\<rangle>\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r),
firstname = elsevierAuthor.firstname \<sigma>,
familyname = elsevierAuthor.surname \<sigma>,
orcid = 0, \<comment> \<open>cheating!\<close>
footnote = elsevierAuthor.footnote \<sigma>\<rparr>"
lemma elsevier_inv_preserved :
"elsevierAuthor.ne_fsnames_inv \<sigma>
\<Longrightarrow> acmAuthor.ne_fsnames_inv (convert\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r\<^sub>A\<^sub>u\<^sub>t\<^sub>h\<^sub>o\<^sub>r\<^sub>\<Rightarrow>\<^sub>a\<^sub>c\<^sub>m\<^sub>A\<^sub>u\<^sub>t\<^sub>h\<^sub>o\<^sub>r \<sigma>)
\<and> author.ne_fsnames_inv (convert\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r\<^sub>A\<^sub>u\<^sub>t\<^sub>h\<^sub>o\<^sub>r\<^sub>\<Rightarrow>\<^sub>a\<^sub>c\<^sub>m\<^sub>A\<^sub>u\<^sub>t\<^sub>h\<^sub>o\<^sub>r \<sigma>)"
unfolding elsevierAuthor.ne_fsnames_inv_def acmAuthor.ne_fsnames_inv_def
convert\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r\<^sub>A\<^sub>u\<^sub>t\<^sub>h\<^sub>o\<^sub>r\<^sub>_\<^sub>a\<^sub>c\<^sub>m\<^sub>A\<^sub>u\<^sub>t\<^sub>h\<^sub>o\<^sub>r_def author.ne_fsnames_inv_def
by auto
text\<open>The proof is, to quote Tony Hoare, ``as simple as it should be''. Note that it involves
lemmas like @{thm concatWith_non_mt} which in themselves require induction, \<^ie>, which are out of
reach of pure ATP proof-techniques. \<close>
subsection\<open>Proving the Preservation of Ontological Mappings: A Domain-Ontology Morphism\<close>
text\<open>The following example is drawn from a domain-specific scenario: for conventional data-models,
be they represented by UML class diagrams, SQL-like ``tables'', or Excel-sheet-like presentations
of uniform data, we can conceive an ontology (which is equivalent here to a conventional style-sheet)
and annotate textual raw data. This example describes how meta-data can be used to
calculate and transform this kind of representation in a type-safe and verified way. \<close>
(*<*)
(* Mapped_PILIB_Ontology example *)
(* rethink examples: should we "morph" providence too ? ? ? Why not ? bu *)
term\<open>fold (+) S 0\<close>
definition sum
where "sum S = (fold (+) S 0)"
(*>*)
text\<open>We model some basic enumerations as inductive data-types: \<close>
datatype Hardware_Type =
Motherboard | Expansion_Card | Storage_Device | Fixed_Media |
Removable_Media | Input_Device | Output_Device
datatype Software_Type =
Operating_system | Text_editor | Web_Navigator | Development_Environment
text\<open>In the sequel, we model a ``Reference Ontology'', \<^ie> a data structure in which, as we assume,
standards or some de-facto-standard data-base refer to the data in the domain of electronic
devices:\<close>
onto_class Resource =
name :: string
onto_class Electronic = Resource +
provider :: string
manufacturer :: string
onto_class Component = Electronic +
mass :: int
onto_class Simulation_Model = Electronic +
simulate :: Component
composed_of :: "Component list"
version :: int
onto_class Informatic = Resource +
description :: string
onto_class Hardware = Informatic +
type :: Hardware_Type
mass :: int
composed_of :: "Component list"
invariant c1 :: "mass \<sigma> = sum(map Component.mass (composed_of \<sigma>))"
onto_class Software = Informatic +
type :: Software_Type
version :: int
text\<open>Finally, we present a \<^emph>\<open>local ontology\<close>, \<^ie> a data structure as used in the data-base
of a local store's cash-system:
onto_class Item =
name :: string
onto_class Product = Item +
serial_number :: int
provider :: string
mass :: int
onto_class Electronic_Component = Product +
serial_number :: int
onto_class Monitor = Product +
composed_of :: "Electronic_Component list"
invariant c2 :: "Product.mass \<sigma> = sum(map Product.mass (composed_of \<sigma>))"
term\<open>Product.mass \<sigma> = sum(map Product.mass (composed_of \<sigma>))\<close>
onto_class Computer_Software = Item +
type :: Software_Type
version :: int
text\<open>These two example ontologies are linked via conversion functions called \<^emph>\<open>morphisms\<close>.
The crucial point is that we can prove, for the morphisms connecting these ontologies, that the
conversions are guaranteed to preserve the data-invariants, although the data-structures (and, of
course, their presentation) are very different. Besides, morphism functions can be ``forgetful''
(\<^ie> surjective), ``embedding'' (\<^ie> injective) or even ``one-to-one'' (\<^ie> bijective).\<close>
definition Item_to_Resource_morphism :: "Item \<Rightarrow> Resource"
("_ \<langle>Resource\<rangle>\<^sub>I\<^sub>t\<^sub>e\<^sub>m" [1000]999)
where " \<sigma> \<langle>Resource\<rangle>\<^sub>I\<^sub>t\<^sub>e\<^sub>m =
\<lparr> Resource.tag_attribute = 1::int ,
Resource.name = name \<sigma> \<rparr>"
definition Product_to_Resource_morphism :: "Product \<Rightarrow> Resource"
("_ \<langle>Resource\<rangle>\<^sub>P\<^sub>r\<^sub>o\<^sub>d\<^sub>u\<^sub>c\<^sub>t" [1000]999)
where " \<sigma> \<langle>Resource\<rangle>\<^sub>P\<^sub>r\<^sub>o\<^sub>d\<^sub>u\<^sub>c\<^sub>t =
\<lparr> Resource.tag_attribute = 2::int ,
Resource.name = name \<sigma> \<rparr>"
definition Computer_Software_to_Software_morphism :: "Computer_Software \<Rightarrow> Software"
("_ \<langle>Software\<rangle>\<^sub>S\<^sub>o\<^sub>f\<^sub>t\<^sub>C\<^sub>m\<^sub>p" [1000]999)
where "\<sigma> \<langle>Software\<rangle>\<^sub>S\<^sub>o\<^sub>f\<^sub>t\<^sub>C\<^sub>m\<^sub>p =
\<lparr> Resource.tag_attribute = 3::int ,
Resource.name = name \<sigma> ,
Informatic.description = ''no description'',
Software.type = type \<sigma> ,
Software.version = version \<sigma> \<rparr>"
definition Electronic_Component_to_Component_morphism :: "Electronic_Component \<Rightarrow> Component"
("_ \<langle>Component\<rangle>\<^sub>E\<^sub>l\<^sub>e\<^sub>c\<^sub>C\<^sub>m\<^sub>p" [1000]999)
where "\<sigma> \<langle>Component\<rangle>\<^sub>E\<^sub>l\<^sub>e\<^sub>c\<^sub>C\<^sub>m\<^sub>p =
\<lparr> Resource.tag_attribute = 4::int ,
Resource.name = name \<sigma> ,
Electronic.provider = provider \<sigma> ,
Electronic.manufacturer = ''no manufacturer'' ,
Component.mass = mass \<sigma> \<rparr>"
definition Monitor_to_Hardware_morphism :: "Monitor \<Rightarrow> Hardware"
("_ \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e" [1000]999)
where "\<sigma> \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e =
\<lparr> Resource.tag_attribute = 5::int ,
Resource.name = name \<sigma> ,
Informatic.description = ''no description'',
Hardware.type = Output_Device,
Hardware.mass = mass \<sigma> ,
Hardware.composed_of = map Electronic_Component_to_Component_morphism (composed_of \<sigma>)
\<rparr>"
text\<open>On this basis, we can state the following invariant preservation theorems:\<close>
lemma inv_c2_preserved :
"c2_inv \<sigma> \<Longrightarrow> c1_inv (\<sigma> \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e)"
unfolding c1_inv_def c2_inv_def
Monitor_to_Hardware_morphism_def Electronic_Component_to_Component_morphism_def
by (auto simp: comp_def)
lemma Monitor_to_Hardware_morphism_total :
"Monitor_to_Hardware_morphism ` ({X::Monitor. c2_inv X}) \<subseteq> ({X::Hardware. c1_inv X})"
using inv_c2_preserved
by auto
type_synonym local_ontology = "Item * Electronic_Component * Monitor"
type_synonym reference_ontology = "Resource * Component * Hardware"
fun ontology_mapping :: "local_ontology \<Rightarrow> reference_ontology"
where "ontology_mapping (x, y, z) = (x\<langle>Resource\<rangle>\<^sub>I\<^sub>t\<^sub>e\<^sub>m, y\<langle>Component\<rangle>\<^sub>E\<^sub>l\<^sub>e\<^sub>c\<^sub>C\<^sub>m\<^sub>p, z\<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e)"
lemma ontology_mapping_total :
"ontology_mapping ` {X. c2_inv (snd (snd X))} \<subseteq> {X. c1_inv (snd (snd X))}"
using inv_c2_preserved
by auto
text\<open>Note that, in contrast to conventional data-translations, the preservation of a class-invariant
is not just established by a validation of the result; it is proven once and for all, for all
instances of the classes.\<close>
subsection\<open>Proving Monitor-Refinements\<close>
(*<*)
(* switch on regexp syntax *)
notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
notation Plus (infixr "||" 55)
notation Times (infixr "~~" 60)
notation Atom ("\<lfloor>_\<rfloor>" 65)
(*>*)
text\<open>Monitors are regular-expressions that allow for specifying instances of classes to appear in
a particular order in a document. They are used to specify some structural aspects of a document.
The checking of monitors is based on an AFP theory by Tobias Nipkow on Functional Automata
(\<^ie> a characterization of regular automata using functional polymorphic descriptions of transition
functions, avoiding any of the ad-hoc finitizations commonly used in automata theory), which also
comprises functions to generate executable deterministic and non-deterministic automata. This
theory is compiled to SML code that is integrated in the \<^dof> system. The necessary
adaptations of this integration can be found in the theory \<^theory>\<open>Isabelle_DOF.RegExpInterface\<close>,
which also contains the basic definitions and theorems for the concepts used here.
Recall that the monitor of \<^term>\<open>scholarly_paper.article\<close> is defined by: \<^vs>\<open>0.5cm\<close>
@{thm [indent=20, margin=70, names_short] scholarly_paper.article_monitor_def}
\<^vs>\<open>0.5cm\<close> However, it is possible to reason over the language of monitors and to prove classical
refinement notions such as trace-refinement on the monitor level, thus once and for all for all
instances of validated documents conforming to a particular ontology. The primitive recursive
operators \<^term>\<open>RegExpInterface.Lang\<close> and \<^term>\<open>RegExpInterface.L\<^sub>s\<^sub>u\<^sub>b\<close> generate the language of a
given regular expression, where \<^term>\<open>L\<^sub>s\<^sub>u\<^sub>b\<close> takes the sub-ordering relation of classes into
account.
The proof of: \<^vs>\<open>0.5cm\<close>
@{thm [indent=20,margin=70,names_short] articles_sub_reports}
\<^vs>\<open>0.5cm\<close> can be found in theory \<^theory>\<open>Isabelle_DOF.technical_report\<close>;
it is, again, ``as simple as it should be'' (to quote Tony Hoare).
The proof of: \<^vs>\<open>0.5cm\<close>
@{thm [indent=20,margin=70,names_short] articles_Lsub_reports}
\<^vs>\<open>0.5cm\<close> is slightly more involved; this is due to the fact that \<^dof> does not automatically generate
a proof of the acyclicity of the graph of the class-hierarchy @{term doc_class_rel}. For a given
hierarchy, this proof will always succeed (since \<^dof> checks this on the meta-level, of course),
which makes it possible to deduce the anti-symmetry of the transitive closure of @{term doc_class_rel}
and therefore to establish that the doc-classes can be organized in an order (\<^ie> the
type \<^typ>\<open>doc_class\<close> is an instance of the type-class \<^class>\<open>order\<close>). On this basis, the proof
of the above language refinement is quasi-automatic. This proof is also part of
\<^theory>\<open>Isabelle_DOF.technical_report\<close>.
\<close>
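text\<open>As a hedged sketch (not part of the original theory): with the regular-expression notation
enabled above, a toy monitor expression over atoms \<open>a\<close>, \<open>b\<close> and \<open>c\<close> can be written as follows;
the statement merely type-checks, with the atoms left as free variables:\<close>
term\<open>RegExpInterface.Lang (\<lfloor>a\<rfloor> ~~ \<lbrace>\<lfloor>b\<rfloor>\<rbrace>\<^sup>* ~~ (\<lfloor>c\<rfloor> || \<lfloor>b\<rfloor>))\<close>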
(*<*)
(* switch off regexp syntax *)
no_notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
no_notation Plus (infixr "||" 55)
no_notation Times (infixr "~~" 60)
no_notation Atom ("\<lfloor>_\<rfloor>" 65)
end
(*>*)

View File

@ -13,9 +13,9 @@
(*<*)
theory
"M_04_RefMan"
"M_06_RefMan"
imports
"M_03_GuidedTour"
"M_05_Proofs_Ontologies"
begin
declare_reference*[infrastructure::technical]
@ -30,7 +30,7 @@ text\<open>
representation, and give hints for the development of new document templates.
\<^isadof> is embedded in the underlying generic document model of Isabelle as described in
\<^introduction>\<open>dof\<close>. Recall that the document language can be extended dynamically, \<^ie>, new
@{scholarly_paper.introduction \<open>dof\<close>}. Recall that the document language can be extended dynamically, \<^ie>, new
\<^emph>\<open>user-defined\<close> can be introduced at run-time. This is similar to the definition of new functions
in an interpreter. \<^isadof> as a system plugin provides a number of new command definitions in
Isabelle's document model.
@ -119,7 +119,7 @@ text\<open>
\<^item> attributes may have default values in order to facilitate notation.
\<^boxed_theory_text>\<open>doc_class\<close>'es and \<^boxed_theory_text>\<open>onto_class\<close>'es respectively, have a semantics,
\<^ie>, a logical representation as extensible records in HOL (@{cite "wenzel:isabelle-isar:2020"},
\<^ie>, a logical representation as extensible records in HOL (\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>,
pp. 11.6); they are therefore amenable to logical reasoning.
\<close>
@ -169,25 +169,25 @@ text\<open>
As ODL is an extension of Isabelle/HOL, document class definitions can therefore be arbitrarily
mixed with standard HOL specification constructs. To make this manual self-contained, we present
syntax and semantics of the specification constructs that are most likely relevant for the
developer of ontologies (for more details, see~@{cite "wenzel:isabelle-isar:2020"}). Our
developer of ontologies (for more details, see~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>). Our
presentation is a simplification of the original sources following the needs of ontology developers
in \<^isadof>:
\<^item> \<open>name\<close>:\<^index>\<open>name@\<open>name\<close>\<close>
with the syntactic category of \<open>name\<close>'s we refer to alpha-numerical identifiers
(called \<open>short_ident\<close>'s in @{cite "wenzel:isabelle-isar:2020"}) and identifiers
(called \<open>short_ident\<close>'s in \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>) and identifiers
in \<^boxed_theory_text>\<open>...\<close> which might contain certain ``quasi-letters'' such
as \<^boxed_theory_text>\<open>_\<close>, \<^boxed_theory_text>\<open>-\<close>, \<^boxed_theory_text>\<open>.\<close> (see~@{cite "wenzel:isabelle-isar:2020"} for
as \<^boxed_theory_text>\<open>_\<close>, \<^boxed_theory_text>\<open>-\<close>, \<^boxed_theory_text>\<open>.\<close> (see~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close> for
details).
% TODO
% This phrase should be reviewed to clarify identifiers.
% Peculiarly, "and identifiers in \<^boxed_theory_text>\<open> ... \<close>".
\<^item> \<open>tyargs\<close>:\<^index>\<open>tyargs@\<open>tyargs\<close>\<close>
\<^rail>\<open> typefree | ('(' (typefree * ',') ')')\<close>
\<open>typefree\<close> denotes fixed type variable (\<open>'a\<close>, \<open>'b\<close>, ...) (see~@{cite "wenzel:isabelle-isar:2020"})
\<open>typefree\<close> denotes fixed type variable (\<open>'a\<close>, \<open>'b\<close>, ...) (see~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>)
\<^item> \<open>dt_name\<close>:\<^index>\<open>dt\_name@\<open>dt_name\<close>\<close>
\<^rail>\<open> (tyargs?) name (mixfix?)\<close>
The syntactic entity \<open>name\<close> denotes an identifier, \<open>mixfix\<close> denotes the usual
parenthesized mixfix notation (see @{cite "wenzel:isabelle-isar:2020"}).
parenthesized mixfix notation (see \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>).
The \<^emph>\<open>name\<close>'s referred here are type names such as \<^typ>\<open>int\<close>, \<^typ>\<open>string\<close>,
\<^type>\<open>list\<close>, \<^type>\<open>set\<close>, etc.
\<^item> \<open>type_spec\<close>:\index{type_spec@\<open>type_spec\<close>}
@ -212,13 +212,13 @@ text\<open>
% Show string in the document output for \<^boxed_theory_text>\<open>ab c\<close> (strings)
\<^boxed_theory_text>\<open>{1,2,3}\<close> (sets), \<^boxed_theory_text>\<open>(1,2,3)\<close> (tuples),
\<^boxed_theory_text>\<open>\<forall> x. P(x) \<and> Q x = C\<close> (formulas). For comprehensive overview,
see~@{cite "nipkow:whats:2020"}.
see~\<^cite>\<open>"nipkow:whats:2020"\<close>.
\<close>
text\<open>
Advanced ontologies can, \<^eg>, use recursive function definitions with
pattern-matching~@{cite "kraus:defining:2020"}, extensible record
specifications~@{cite "wenzel:isabelle-isar:2020"}, and abstract type declarations.
pattern-matching~\<^cite>\<open>"kraus:defining:2020"\<close>, extensible record
specifications~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>, and abstract type declarations.
\<close>
text\<open>\<^isadof> works internally with fully qualified names in order to avoid confusions
@ -290,7 +290,7 @@ text\<open>
class representation, the identifier \<^boxed_latex>\<open>#1\<close> refers to the content of the main text of the
document class (written in \<^boxed_theory_text>\<open>\<open> ... \<close>\<close>) and the attributes can be referenced
by their name using the \<^ltxinline>\<open>\commandkey{...}\<close>-command (see the documentation of the
\<^LaTeX>-package ``keycommand''~@{cite "chervet:keycommand:2010"} for details). Usually, the
\<^LaTeX>-package ``keycommand''~\<^cite>\<open>"chervet:keycommand:2010"\<close> for details). Usually, the
representations definition needs to be wrapped in a
\<^ltxinline>\<open>\begin{isamarkup}...\end{isamarkup}\<close>-environment, to ensure the correct context
within Isabelle's \<^LaTeX>-setup.
@ -353,7 +353,7 @@ text\<open>
as specified in @{technical \<open>odl_manual0\<close>}.
\<^item> \<open>meta_args\<close> :
\<^rail>\<open>obj_id ('::' class_id) ((',' attribute '=' HOL_term) *) \<close>
\<^item> \<^emph>\<open>evaluator\<close>: from @{cite "wenzel:isabelle-isar:2020"}, evaluation is tried first using ML,
\<^item> \<^emph>\<open>evaluator\<close>: from \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>, evaluation is tried first using ML,
falling back to normalization by evaluation if this fails. Alternatively a specific
evaluator can be selected using square brackets; typical evaluators use the
current set of code equations to normalize and include \<open>simp\<close> for fully
@ -433,7 +433,7 @@ term antiquotations:
will get the value of the \<^const>\<open>definition_list\<close> attribute of the instance \<open>odl-manual1\<close>.
\<^theory_text>\<open>value_\<close> may have an optional argument between square brackets to specify the evaluator:
\<^theory_text>\<open>@{value_ [nbe] \<open>definition_list @{technical \<open>odl-manual1\<close>}\<close>}\<close> forces \<open>value_\<close> to evaluate
the term using normalization by evaluation (see @{cite "wenzel:isabelle-isar:2020"}).
the term using normalization by evaluation (see \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>).
\<close>
(*<*)
@ -451,7 +451,7 @@ text*[cc_assumption_test_ref::cc_assumption_test]\<open>\<close>
definition tag_l :: "'a \<Rightarrow> 'b \<Rightarrow> 'b" where "tag_l \<equiv> \<lambda>x y. y"
lemma* tagged : "tag_l @{cc_assumption_test \<open>cc_assumption_test_ref\<close>} AA \<Longrightarrow> AA"
lemma* tagged : "tag_l @{cc_assumption_test \<open>cc_assumption_test_ref\<close>} AAA \<Longrightarrow> AAA"
by (simp add: tag_l_def)
find_theorems name:tagged "(_::cc_assumption_test \<Rightarrow> _ \<Rightarrow> _) _ _ \<Longrightarrow>_"
@ -495,7 +495,7 @@ The major commands providing term-contexts are\<^footnote>\<open>The meta-argume
Wrt. creation, checking, and traceability, these commands are analogous to the ontological text and
code-commands. However, the argument terms may contain term-antiquotations stemming from an
ontology definition. Term-contexts are type-checked and \<^emph>\<open>validated\<close> against
the global context (so: in the term @{term_ [source] \<open>@{author \<open>bu\<close>}\<close>}, \<open>bu\<close>
the global context (so: in the term @{term_ [source] \<open>@{scholarly_paper.author \<open>bu\<close>}\<close>}, \<open>bu\<close>
is indeed a string which refers to a meta-object belonging to the document class \<^typ>\<open>author\<close>,
for example). With the exception of the @{command "term*"}-command, the term-antiquotations
in the other term-contexts are additionally expanded (\<^eg> replaced) by the instance of the class,
@ -514,7 +514,7 @@ with the \<^boxed_theory_text>\<open>disable_assert_evaluation\<close> theory at
Then @{command "assert*"} will act like @{command "term*"}.
The @{command "definition*"}-command allows \<open>prop\<close>, \<open>spec_prems\<close>, and \<open>for_fixes\<close>
(see the @{command "definition"} command in @{cite "wenzel:isabelle-isar:2020"}) to contain
(see the @{command "definition"} command in \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>) to contain
term-antiquotations. For example:
@{boxed_theory_text [display]
\<open>doc_class A =
@ -525,7 +525,7 @@ definition*[a1::A, x=5, level="Some 1"] xx' where "xx' \<equiv> A.x @{A \<open>a
The @{term_ [source] \<open>@{A \<open>a1\<close>}\<close>} term-antiquotation is used both in \<open>prop\<close> and in \<open>spec_prems\<close>.
@{command "lemma*"}, @{command "theorem*"}, etc., are extended versions of the goal commands
defined in @{cite "wenzel:isabelle-isar:2020"}. Term-antiquotations can be used either in
defined in \<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>. Term-antiquotations can be used either in
a \<open>long_statement\<close> or in a \<open>short_statement\<close>.
For instance:
@{boxed_theory_text [display]
@ -671,7 +671,7 @@ doc_class text_element =
variants :: "String.literal set" <= "{STR ''outline'', STR ''document''}"
\<close>}
As mentioned in @{technical \<open>sss\<close>},
As mentioned in @{scholarly_paper.technical \<open>sss\<close>},
\<^const>\<open>level\<close> defines the section-level (\<^eg>, using a \<^LaTeX>-inspired hierarchy:
from \<^boxed_theory_text>\<open>Some -1\<close> (corresponding to \<^boxed_latex>\<open>\part\<close>) to
\<^boxed_theory_text>\<open>Some 0\<close> (corresponding to \<^boxed_latex>\<open>\chapter\<close>, respectively, \<^boxed_theory_text>\<open>chapter*\<close>)
@ -807,11 +807,11 @@ doc_class text_section = text_element +
Besides attributes serving more practical considerations, like a \<^const>\<open>fixme_list\<close> that can be modified
during the editing process but is only visible in the integrated source and usually ignored in the
\<^LaTeX>, this class also introduces the possibility to assign an ``ownership" or ``responsibility" of
\<^LaTeX>, this class also introduces the possibility to assign an ``ownership'' or ``responsibility'' of
a \<^typ>\<open>text_element\<close> to a specific \<^typ>\<open>author\<close>. Note that this is possible since \<^isadof> assigns to each
document class also a class-type which is declared in the HOL environment.\<close>
text*[s23::example, main_author = "Some(@{author \<open>bu\<close>})"]\<open>
text*[s23::example, main_author = "Some(@{scholarly_paper.author \<open>bu\<close>})"]\<open>
Recall that concrete authors can be denoted by term-antiquotations generated by \<^isadof>; for example,
this may be for a text fragment like
@{boxed_theory_text [display]
@ -960,7 +960,7 @@ subsubsection\<open>For Isabelle Hackers: Defining New Top-Level Commands\<close
text\<open>
Defining such new top-level commands requires some Isabelle knowledge as well as
extending the dispatcher of the \<^LaTeX>-backend. For the details of defining top-level
commands, we refer the reader to the Isar manual~@{cite "wenzel:isabelle-isar:2020"}.
commands, we refer the reader to the Isar manual~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>.
Here, we only give a brief example how the \<^boxed_theory_text>\<open>section*\<close>-command is defined; we
refer the reader to the source code of \<^isadof> for details.
@ -1157,7 +1157,7 @@ text\<open>
The \<^boxed_theory_text>\<open>\<sigma>\<close> symbol is reserved and references the future class instance.
By relying on the implementation of the Records
in Isabelle/HOL~@{cite "wenzel:isabelle-isar:2020"},
in Isabelle/HOL~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>,
one can reference an attribute of an instance using its selector function.
For example, \<^boxed_theory_text>\<open>int1 \<sigma>\<close> denotes the value
of the \<^boxed_theory_text>\<open>int1\<close> attribute
@ -1370,6 +1370,23 @@ value*\<open>filter (\<lambda>\<sigma>. the (text_section.level \<sigma>) > 1) @
section*[infrastructure::technical]\<open>Technical Infrastructure\<close>
subsection\<open>The Previewer\<close>
figure*[
global_DOF_view::figure
, relative_width="95"
, file_src="''figures/ThisPaperWithPreviewer.png''"
]\<open>A Screenshot while editing this Paper in \<^dof> with Preview.\<close>
text\<open>A screenshot of the editing environment is shown in \<^figure>\<open>global_DOF_view\<close>. It supports
incremental continuous PDF generation which improves usability. Currently, the granularity
is restricted to entire theories (which have to be selected in a specific document pane).
The response times cannot (yet) compete
with a Word or Overleaf editor, though, which is mostly due to the checking and
evaluation overhead (the turnaround of this section is about 30 s). However, we believe
that better parallelization and evaluation techniques will decrease this gap substantially for the
most common cases in future versions. \<close>
subsection\<open>Developing Ontologies and their Representation Mappings\<close>
text\<open>
The document core \<^emph>\<open>may\<close>, but \<^emph>\<open>need\<close> not, use Isabelle definitions or proofs for checking the
@ -1525,8 +1542,8 @@ text\<open>
Examples of this can be found, \<^eg>, in the ontology-style
\<^file>\<open>../../ontologies/scholarly_paper/DOF-scholarly_paper.sty\<close>.
For details about the expansion mechanism
in general, we refer the reader to the \<^LaTeX> literature (\<^eg>,~@{cite "knuth:texbook:1986"
and "mittelbach.ea:latex:1999" and "eijkhout:latex-cs:2012"}).
in general, we refer the reader to the \<^LaTeX> literature (\<^eg>,~\<^cite>\<open>"knuth:texbook:1986"
and "mittelbach.ea:latex:1999" and "eijkhout:latex-cs:2012"\<close>).
\<close>

View File

@ -12,9 +12,9 @@
*************************************************************************)
(*<*)
theory "M_05_Implementation"
imports "M_04_RefMan"
begin
theory "M_07_Implementation"
imports "M_06_RefMan"
begin
(*>*)
@ -33,7 +33,7 @@ text\<open>
\<close>
text\<open>
Semantic macros, as required by our document model, are called \<^emph>\<open>document antiquotations\<close>
in the Isabelle literature~@{cite "wenzel:isabelle-isar:2020"}. While Isabelle's code-antiquotations
in the Isabelle literature~\<^cite>\<open>"wenzel:isabelle-isar:2020"\<close>. While Isabelle's code-antiquotations
are an old concept going back to Lisp, which found its way via SML and OCaml into modern
proof systems, special annotation syntax inside documentation comments has its roots in
documentation generators such as Javadoc. Their use, however, as a mechanism to embed
@ -210,13 +210,13 @@ text\<open>
possible;
otherwise, if \<^boxed_sml>\<open>next\<close> fails, an error is reported. The automata implementation
is, in large parts, generated from a formalization of functional automata
@{cite "nipkow.ea:functional-Automata-afp:2004"}.
\<^cite>\<open>"nipkow.ea:functional-Automata-afp:2004"\<close>.
\<close>
section\<open>The \<^LaTeX>-Core of \<^isadof>\<close>
text\<open>
The \<^LaTeX>-implementation of \<^isadof> heavily relies on the
``keycommand''~@{cite "chervet:keycommand:2010"} package. In fact, the core \<^isadof> \<^LaTeX>-commands
``keycommand''~\<^cite>\<open>"chervet:keycommand:2010"\<close> package. In fact, the core \<^isadof> \<^LaTeX>-commands
are just wrappers for the corresponding commands from the keycommand package:
@{boxed_latex [display]