Compare commits

...

475 Commits

Author SHA1 Message Date
Nicolas Méric fda02be889 Update dedukti-presentation example to Isabelle 2023 2023-10-10 09:54:49 +02:00
Nicolas Méric 989ab3c315 Update reification_test for Dedukti presentation 2023-10-10 08:51:53 +02:00
Nicolas Méric 7b54bf5ca5 Cleanup 2023-09-20 15:56:50 +02:00
Nicolas Méric baa36b10c1 Cleanup 2023-09-20 15:00:43 +02:00
Nicolas Méric c57ce6292b Update output type name for latex refs 2023-09-19 17:03:00 +02:00
Achim D. Brucker b698572146 Documented Isabelle version (2023). 2023-09-14 06:33:28 +01:00
Achim D. Brucker e12abadc94 Test with Isabelle 2023. 2023-09-14 06:29:01 +01:00
Achim D. Brucker 792fd60055 Merge branch 'main' into isabelle_nightly 2023-09-12 18:58:58 +01:00
Nicolas Méric ec7297f1d3 Update instances list term antiquotation
Make instances list term antiquotation compatible with
polymorphic classes
2023-09-11 09:07:10 +02:00
Achim D. Brucker e4ee3ff240 Merge branch 'main' into isabelle_nightly 2023-08-31 08:27:53 +01:00
Achim D. Brucker 4393042f2c Merge. 2023-08-29 08:09:28 +01:00
Achim D. Brucker fef7b9d60b Merge commit 'cef4086029' into isabelle_nightly 2023-08-29 06:40:37 +01:00
Achim D. Brucker ab7d695a77 Merge. 2023-08-29 06:37:33 +01:00
Achim D. Brucker c063287947 Isabelle API update. 2023-08-29 06:11:32 +01:00
Achim D. Brucker 342984df3b Converted def into newcommand. 2023-08-04 07:01:42 +01:00
Achim D. Brucker 5a8e79fb7e Moved default value for title into template, as some LaTeX classes do not allow for a pre-set title. 2023-08-04 04:37:14 +01:00
Achim D. Brucker d7f9f10ef1 Merge commit 'b4f1b8c32177ce5af37357fc4a7ab0df22a497d6' into isabelle_nightly 2023-08-03 03:41:43 +01:00
Achim D. Brucker 0a3259fbca Merge commit '59b082d09d55d55ef6c6f8bd8e821122dddf3574' into isabelle_nightly 2023-08-03 03:35:29 +01:00
Nicolas Méric ca7cdec9b4 Fix typos 2023-07-20 16:31:08 +02:00
Nicolas Méric 43aad517b9 Add basic explanation for lemma*, etc.
Add a basic explanation of how to use lemma*, etc.
with term antiquotations of polymorphic class instances
2023-07-20 16:25:25 +02:00
Nicolas Méric 8d6c8929e2 Fix typos 2023-07-20 16:14:25 +02:00
Nicolas Méric b447a480fb Fix manual latex compilation 2023-07-20 15:04:39 +02:00
Nicolas Méric a78397693e Update instances term antiquotation in manual 2023-07-20 14:32:03 +02:00
Nicolas Méric 9812bc0517 Use binding for instances name 2023-07-20 10:11:48 +02:00
Nicolas Méric b364880bfc Polymorphic classes first draft 2023-07-19 18:58:04 +02:00
Nicolas Méric 5a7cbf2da5 Add file checking in figure_content 2023-06-20 11:05:11 +02:00
Nicolas Méric 7f7780f8fd Update restriction of RegExpInterface notations to onto class definition 2023-06-19 19:10:21 +02:00
Nicolas Méric 889805cccc Add basic block environment support for beamer 2023-06-19 09:19:28 +02:00
Nicolas Méric 5a07aa2453 Delete useless tests 2023-06-16 18:37:00 +02:00
Nicolas Méric cef4086029 Add basic support for beamer frame options and add a figure_content antiquotation 2023-06-16 11:54:33 +02:00
Nicolas Méric 9df276ac6f Add first beamer frame implementation in SML 2023-06-15 16:07:08 +02:00
Nicolas Méric b4f1b8c321 Fix ECs latex list of tables 2023-06-06 19:03:20 +02:00
Nicolas Méric 59b082d09d Handle "_" and "'" in mixfix to be compatible with inner syntax names 2023-06-06 16:44:11 +02:00
Achim D. Brucker 1869a96b2d API update. 2023-06-04 12:01:38 +02:00
Achim D. Brucker e95c6386af Merge branch 'main' into isabelle_nightly 2023-06-04 10:20:13 +02:00
Achim D. Brucker 23a85cc8c2 Minor tuning of beamer-related examples. 2023-06-01 17:58:55 +02:00
Achim D. Brucker ddcfb5f708 Initial commit: stubs for using beamer. 2023-06-01 00:21:19 +02:00
Achim D. Brucker 02d13cdcad Fixed release script. 2023-05-27 21:48:04 +02:00
Achim D. Brucker d353ff07cc Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-05-27 20:56:56 +02:00
Achim D. Brucker 38035785da Fixed SRAC definition. 2023-05-27 20:56:47 +02:00
Achim D. Brucker 7e7c197ac3 Merge branch 'main' into isabelle_nightly 2023-05-25 11:42:31 +02:00
Nicolas Méric 4f8e588138 Document disable_assert_evaluation theory attribute in the manual 2023-05-24 14:17:16 +02:00
Nicolas Méric 2c0b51779e Add the possibility to disable evaluation for assert* 2023-05-24 12:38:29 +02:00
Nicolas Méric 350ff6fe76 Make class invariant long-names unique
Class invariant names now internally use the class name
as a user Binding.qualifier.
This way one can use the same name for an invariant
in two different classes in the same theory:

doc_class "hypothesis"  = math_content +
   referentiable :: bool <= "True"
   level         :: "int option"         <= "Some 2"
   mcc           :: "math_content_class" <= "hypt"
   invariant d :: "mcc σ = hypt"

doc_class "math_proof"  = math_content +
   referentiable :: bool <= "True"
   level         :: "int option"         <= "Some 2"
   mcc           :: "math_content_class" <= "prf_stmt"
   invariant d :: "mcc σ = prf_stmt"

find_consts name:"math_proof.d_inv"
find_consts name:"hypothesis.d_inv"
2023-05-23 14:44:16 +02:00
Achim D. Brucker c803474950 Merge branch 'main' into isabelle_nightly 2023-05-19 16:19:00 +02:00
Achim D. Brucker e17f09e624 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-05-19 16:16:30 +02:00
Achim D. Brucker 8051d4233e Ensure compatibility with TeX Live 2019 (as used by AFP's build servers). 2023-05-17 13:57:35 +02:00
Nicolas Méric b4b63ce989 Add subcaption package to sn-article template 2023-05-17 12:42:16 +02:00
Achim D. Brucker 2dc16b263f Removed root.tex (bug). 2023-05-17 12:19:30 +02:00
Achim D. Brucker 5754bb4adc Added chapter AFP and timeout. 2023-05-17 09:17:17 +02:00
Achim D. Brucker c7debc577b Moved src formats into subfolder and removed them from ROOT file. 2023-05-17 09:16:41 +02:00
Achim D. Brucker 9c94593f45 Removed unused files. 2023-05-17 06:39:44 +02:00
Nicolas Méric 4d89250606 Restrict RegExpInterface notations to onto class definition 2023-05-16 12:27:19 +02:00
Achim D. Brucker 3f06320034 Merge branch 'main' into isabelle_nightly 2023-05-15 17:56:39 +02:00
Achim D. Brucker 49faed4faf Disabled PDF generation for currently not supported references. 2023-05-15 17:55:52 +02:00
Achim D. Brucker 1a22441f3e Merge branch 'main' into isabelle_nightly 2023-05-15 14:31:01 +02:00
Achim D. Brucker df1b2c9904 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-05-15 14:29:11 +02:00
Achim D. Brucker 9064cd3f62 Include CENELEC 50128 terminology. 2023-05-15 14:28:56 +02:00
Nicolas Méric f5b8d4348b Update mini-odo example references 2023-05-15 13:16:40 +02:00
Achim D. Brucker d225a3253c Fixed typo. 2023-05-15 13:03:52 +02:00
Achim D. Brucker 2ee0bc5074 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-05-15 13:02:52 +02:00
Achim D. Brucker 9683ea7efa Ad-hoc fix of undefined references. 2023-05-15 13:02:49 +02:00
Burkhart Wolff bce097b1d6 Commenting out refs to definitionSTAR 2023-05-15 13:02:41 +02:00
Nicolas Méric 65d6fb946d Update unchecked references 2023-05-15 12:23:31 +02:00
Achim D. Brucker 060f2aca89 Merge branch 'main' into isabelle_nightly 2023-05-15 10:50:08 +02:00
Nicolas Méric ba7c0711a8 Update documentation and some refactoring 2023-05-15 10:48:40 +02:00
Achim D. Brucker 4adbe4ce81 Merge branch 'main' into isabelle_nightly 2023-05-15 10:20:12 +02:00
Achim D. Brucker 7e698a9e69 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-05-15 10:16:34 +02:00
Achim D. Brucker 2569db05c3 Pushed raggedbottom into templates. 2023-05-15 10:16:31 +02:00
Nicolas Méric cd311d8a3a Update figure* implementation 2023-05-15 09:36:02 +02:00
Achim D. Brucker fb69f05ac0 Merge pull request 'idir-remarks' (#30) from idir-remarks into main
Reviewed-on: Isabelle_DOF/Isabelle_DOF#30
2023-05-15 06:34:49 +00:00
Achim D. Brucker 1986d0bcbd Merge branch 'main' into idir-remarks 2023-05-15 06:34:34 +00:00
Achim D. Brucker bbac65e233 Proof reading. 2023-05-15 08:30:33 +02:00
Achim D. Brucker 9cd34d7815 Run latexmk in error mode for checking for undefined references and other errors after build. 2023-05-15 07:35:11 +02:00
Achim D. Brucker 641bea4a58 Improved documentation and fixed width-bug of figure* macro. 2023-05-15 00:01:30 +02:00
Burkhart Wolff d0cd28a45c eliminated side_by_side figure, updated refman. 2023-05-14 17:35:00 +02:00
Burkhart Wolff db4290428f ... 2023-05-13 18:22:27 +02:00
Burkhart Wolff 43da6d3197 ... 2023-05-13 18:20:29 +02:00
Achim D. Brucker a93046beac Merge. 2023-05-13 00:09:44 +02:00
Nicolas Méric b8282b771e Cleanup 2023-05-12 20:04:44 +02:00
Burkhart Wolff 1cfc4ac88a Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-05-12 17:50:50 +02:00
Burkhart Wolff e9044e8d5a Updating odo 2023-05-12 17:50:42 +02:00
Achim D. Brucker 6bab138af6 Removed default author. 2023-05-12 17:50:17 +02:00
Achim D. Brucker fcc25f7450 Removed implementation of figure* and side_by_side_figure. 2023-05-12 17:47:18 +02:00
Burkhart Wolff e97cca1a2c reactivated Cenelec_Test 2023-05-12 17:17:57 +02:00
Burkhart Wolff 33fd1453a0 Global removal of side-by-side-figures, fixing various bugs - Caveat: no correspondence figure* - class figure. 2023-05-12 17:04:30 +02:00
Burkhart Wolff 543c647bcc Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-05-12 16:19:29 +02:00
Burkhart Wolff f7141f0df8 debugging the LaTeX generation for COL 2023-05-12 16:19:14 +02:00
Burkhart Wolff 514ebee17c pass on new figure implementation 2023-05-12 15:11:37 +02:00
Burkhart Wolff bdc8477f38 Code Cleanup 2023-05-12 09:42:52 +02:00
Nicolas Méric 7e01b7de97 Implement long names for class term-antiquotations
Examples with value* that now work:

value*‹@{scholarly-paper.author ‹church'›}›
value*‹@{author ‹church›}›
value*‹@{Concept-High-Level-Invariants.author ‹church›}›

value*‹@{scholarly-paper.author-instances}›
value*‹@{author-instances}›
value*‹@{Concept-High-Level-Invariants.author-instances}›
2023-05-11 19:02:55 +02:00
Burkhart Wolff 8bdd40fc20 basic problems on multiple subfloats content solved 2023-05-11 16:21:37 +02:00
Idir Ait-Sadoune 9cc03c0816 Idir remarks for the introduction of the manual. 2023-05-11 13:18:12 +02:00
Idir Ait-Sadoune e9cfcdbcbc Idir remarks for the abstract of the manual. 2023-05-11 12:48:49 +02:00
Burkhart Wolff 36740bf72b debugging fig_content 2023-05-11 11:48:05 +02:00
Burkhart Wolff b8da1a304a Improved fig_content, fix backend bugs in COL_Test 2023-05-10 18:31:27 +02:00
Burkhart Wolff 5b519fcbe6 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-05-10 15:54:17 +02:00
Burkhart Wolff 50da7670cf Some repair on the coherence problems in COL 2023-05-10 15:54:02 +02:00
Achim D. Brucker 09d1b27f10 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-05-10 15:21:35 +02:00
Achim D. Brucker 34e23b314f Overwrite checks by scholarly paper. 2023-05-10 15:21:29 +02:00
Burkhart Wolff 0aa9f1ff25 renamed figure2 into float 2023-05-10 12:37:29 +02:00
Achim D. Brucker 3f8fc4f16f Tuning. 2023-05-10 11:13:45 +02:00
Achim D. Brucker b62b391410 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-05-10 10:40:44 +02:00
Achim D. Brucker 41a4f38478 Initial support for Springer Nature's LaTeX template. 2023-05-10 10:40:19 +02:00
Burkhart Wolff ca8671ee1c a version with @{fig_content in the test 2023-05-09 23:12:50 +02:00
Burkhart Wolff 9e210b487a a version with @{fig_content in the test 2023-05-09 23:08:57 +02:00
Burkhart Wolff 6317294721 layout trimming 2023-05-09 22:37:41 +02:00
Burkhart Wolff 762680a20c eliminated calamity with tick symbol, layout improvements, eliminated docitem 2023-05-09 20:18:33 +02:00
Burkhart Wolff 850244844b eliminated calamity with tick symbol, layout improvements, eliminated docitem 2023-05-09 20:17:00 +02:00
Burkhart Wolff 322d70ef69 deleting subparagraph (never used), orienting Example-I on figure2. 2023-05-09 16:15:47 +02:00
Burkhart Wolff b04ff7e31a Some first tests on the COL library, assuring coherence between text* and figure* versions. 2023-05-09 12:59:42 +02:00
Burkhart Wolff 7ba220e417 LaTeX sty bug wrt figure2 2023-05-09 12:16:37 +02:00
Burkhart Wolff 713a24615f LaTeX sty bug wrt figure2 2023-05-09 12:13:23 +02:00
Burkhart Wolff 7ffdcbc569 experiment with figure2 2023-05-09 04:14:57 +02:00
Achim D. Brucker 43ce393e4a Merge branch 'main' into isabelle_nightly 2023-05-07 17:43:31 +01:00
Burkhart Wolff 4326492b39 false box 2023-05-06 15:55:22 +02:00
Burkhart Wolff 1e7f6a7c18 trimming, putting the begin-figure blocks in independent text elements 2023-05-06 15:15:53 +02:00
Achim D. Brucker a087e94ebe Merge branch 'main' into isabelle_nightly 2023-05-05 06:18:51 +01:00
Achim D. Brucker 78cb606268 Removed mkroot example, which is only available when using Isabelle/DOF as a proper Isabelle component. 2023-05-04 14:20:40 +01:00
Achim D. Brucker c40a5a74c1 Ad-hoc conversion of listing-environments (LaTeX) to boxed-antiquotations. 2023-05-04 14:11:32 +01:00
Achim D. Brucker fc214fc391 Merge branch 'main' into isabelle_nightly 2023-05-03 11:57:59 +01:00
Burkhart Wolff f613811154 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-05-03 10:55:44 +02:00
Burkhart Wolff 4c66716999 experiments with boxes 2023-05-03 10:55:37 +02:00
Achim D. Brucker 639abb6cf5 Removed not used listings setup. 2023-05-02 23:06:02 +01:00
Achim D. Brucker 2c00f4b8db Synchronised updates. 2023-05-02 22:34:34 +01:00
Burkhart Wolff d9e2f251d2 Odds and ends 2023-05-02 12:21:48 +02:00
Burkhart Wolff cec21c9935 kicked out inlineisar 2023-05-02 11:37:03 +02:00
Achim D. Brucker 640a867f28 Port to Isabelle Nightly. 2023-04-28 15:00:10 +01:00
Achim D. Brucker 0c654e2634 Pull image for build ... 2023-04-28 11:21:13 +01:00
Achim D. Brucker 01bcc48c79 Fixing repo location in container (Fixes #26). 2023-04-28 11:20:23 +01:00
Achim D. Brucker c3aaaf9ebb Force pull of container and print latest log from Isabelle repo. 2023-04-28 07:33:24 +01:00
Achim D. Brucker 47e8fc805f Merge branch 'main' into Isabelle_dev 2023-04-27 14:54:52 +01:00
Achim D. Brucker 02bf9620f6 Changed registry. 2023-04-27 14:54:22 +01:00
Nicolas Méric 18be1ba5f5 Clean up dead code 2023-04-27 15:16:47 +02:00
Nicolas Méric 93c722a41b Update malformed theory names
Theory names should use Isabelle inner syntax to allow
referencing objects using long names.
For inner syntax, see the isar-ref manual
about syntax category "longid",
which is the same as "long_ident" of outer syntax
(but not "name" or "system_name").
2023-04-27 14:53:17 +02:00
Nicolas Méric 0f48f356df Fix sml latex environment issue with "$" 2023-04-27 14:53:17 +02:00
Achim D. Brucker 870a4eec57 Merge branch 'main' into Isabelle_dev 2023-04-27 13:35:40 +01:00
Achim D. Brucker 4df233e9f4 Updated image name. 2023-04-27 13:35:11 +01:00
Burkhart Wolff 5d7b50ca7f Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-04-27 13:03:07 +02:00
Burkhart Wolff 1ebfaccb50 amended explication of examples. 2023-04-27 13:02:57 +02:00
Burkhart Wolff 7ce3fdf768 added LNCS number to ABZ paper 2023-04-27 13:02:37 +02:00
Burkhart Wolff db130bd6ce ... 2023-04-26 15:31:14 +02:00
Achim D. Brucker 496a850700 Merge branch 'main' into Isabelle_dev 2023-04-26 08:37:45 +01:00
Achim D. Brucker 101f96a261 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-04-26 08:16:39 +01:00
Achim D. Brucker 49aa29ee68 Normalised LaTeX command names. 2023-04-26 08:16:32 +01:00
Burkhart Wolff 2919f5d2a5 animation over ontologies vs. meta-language 2023-04-26 07:14:46 +02:00
Burkhart Wolff 6cafcce536 boxed sml preserves now $. 2023-04-25 22:05:33 +02:00
Burkhart Wolff ebce149d6a ... 2023-04-25 17:50:05 +02:00
Burkhart Wolff 6984b9ae03 minor stuff 2023-04-25 17:09:36 +02:00
Burkhart Wolff 74e2341971 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-04-25 16:21:05 +02:00
Burkhart Wolff 16caefc7be revision/restructuring > pp 40 2023-04-25 16:20:58 +02:00
Achim D. Brucker 0d74645d2e Merge and upgrade to development version of Isabelle/HOL. 2023-04-24 22:26:39 +01:00
Burkhart Wolff f906d45d48 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-04-24 12:15:27 +02:00
Burkhart Wolff 761a336a7a Nicos improvements. 2023-04-24 12:05:04 +02:00
Nicolas Méric b3f396fb08 Fix abstract \isadof macro name 2023-04-20 14:55:27 +02:00
Burkhart Wolff 77aeb3b7ca ... 2023-04-20 14:29:38 +02:00
Burkhart Wolff 81208f73a8 more thorough reference tests .... 2023-04-20 14:29:25 +02:00
Burkhart Wolff f093bfc961 ... 2023-04-20 11:46:50 +02:00
Burkhart Wolff 2c7df482e8 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-04-20 09:47:30 +02:00
Burkhart Wolff c9de5f2293 put ltxinline into macro notation 2023-04-20 09:47:22 +02:00
Nicolas Méric c6dc848438 Some cleanup 2023-04-20 08:30:09 +02:00
Burkhart Wolff 1acf863845 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-04-19 21:50:51 +02:00
Burkhart Wolff a6aca1407e added sec on term atq, restructuring 2023-04-19 21:50:43 +02:00
Burkhart Wolff 4c953fb954 revised sec 3 2023-04-19 21:17:07 +02:00
Nicolas Méric 77e8844687 Fix Cenelec test build error 2023-04-19 15:57:43 +02:00
Nicolas Méric 939715aba9 Fix scholarly_paper 2023-04-19 15:53:31 +02:00
Burkhart Wolff d809211481 revision to 2 completed, still TODOs in 3 and 4 and beyond 2023-04-19 13:17:26 +02:00
Achim D. Brucker 480272ad86 Merge branch 'main' into Isabelle_dev 2023-04-16 08:45:16 +01:00
Achim D. Brucker d277fa2aed Updated READMEs after session renaming. 2023-04-15 16:55:15 +01:00
Achim D. Brucker 9318ea55a0 Fixed archive building after session renaming. 2023-04-15 16:52:25 +01:00
Achim D. Brucker 3408b90f89 Added autoref names. 2023-04-15 13:16:14 +01:00
Burkhart Wolff dd0a9981a3 LaTeX bug fixed, little optimizations 2023-04-15 10:30:04 +02:00
Achim D. Brucker e549bcb23c Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-04-14 21:23:23 +01:00
Achim D. Brucker 04c8c8d150 Fixed setup for mathematical concepts. 2023-04-14 21:04:08 +01:00
Achim D. Brucker a5885b3eb5 Fixed ref/label setup. 2023-04-14 20:56:43 +01:00
Achim D. Brucker 4cdb6d725b Use DOF-CC_terminology.sty. 2023-04-14 20:55:23 +01:00
Achim D. Brucker 486ae2db97 Initial commit. 2023-04-14 20:54:45 +01:00
Burkhart Wolff fb8da62182 minor polishing 2023-04-14 14:46:15 +02:00
Burkhart Wolff 6c588c3fe4 added diag 'integrated document' 2023-04-14 10:41:14 +02:00
Burkhart Wolff 3ab6f665eb rearranging the story in Background 2023-04-13 22:00:35 +02:00
Burkhart Wolff 0c8bc2cab3 new high-level presentations in background 2023-04-13 18:29:10 +02:00
Burkhart Wolff 20ac16196a ... 2023-04-12 14:25:48 +02:00
Burkhart Wolff d62cd04e26 alphabetic order of authors 2023-04-12 13:48:24 +02:00
Burkhart Wolff 96d20c127f pass over Background 2023-04-12 13:46:05 +02:00
Burkhart Wolff 394189e9e0 twiddle of text in jedit removed. 2023-04-12 13:14:00 +02:00
Burkhart Wolff 1f79e37d9b pass over Background 2023-04-12 13:11:09 +02:00
Burkhart Wolff b43de570a4 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-04-12 10:34:56 +02:00
Burkhart Wolff debddc45d2 diverse modifs. 2023-04-12 10:34:41 +02:00
Burkhart Wolff 3de5548642 reset 2023-04-11 23:17:32 +02:00
Burkhart Wolff 4157954506 revision of front, intro and background (incomplete) 2023-04-11 23:15:32 +02:00
Burkhart Wolff 25473b177b added (incomplete) ref to ABZ paper 2023-04-11 23:14:33 +02:00
Nicolas Méric 36cd3817cf Quick fix for text* macros latex output 2023-04-11 18:52:57 +02:00
Burkhart Wolff cb2b0dc230 ... 2023-04-06 15:23:55 +02:00
Burkhart Wolff c82a3a7e70 restructuring with iFM2020 as own AFP component 2023-04-06 13:48:38 +02:00
Burkhart Wolff 8c6abf2613 ... 2023-04-05 16:46:21 +02:00
Achim D. Brucker 07444efd21 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-03-30 17:26:44 +01:00
Achim D. Brucker c203327191 Optimized dispatcher. 2023-03-30 17:26:39 +01:00
Nicolas Méric a90202953b Use instance long-names for latex labels and references generation 2023-03-30 17:15:35 +02:00
Achim D. Brucker 698e6ab169 Bug fix: document variants. 2023-03-29 22:21:44 +01:00
Achim D. Brucker 320614004e Improved LaTeX support for Lemma*, Theorem*, Definition*, etc. 2023-03-29 22:21:22 +01:00
Burkhart Wolff 91ff9c67af repaired some obvious errors in sty - still incomplete 2023-03-29 11:41:30 +02:00
Burkhart Wolff 1838baecb9 some revision of ITP paper 2023-03-28 09:54:16 +02:00
Nicolas Méric ef29a9759f Some clean-up 2023-03-27 10:39:29 +02:00
Nicolas Méric 5336e0518f Allow standard Isabelle name pattern for instances name 2023-03-27 10:00:10 +02:00
Burkhart Wolff accc4f40b4 Improved Testset for new ontology elements 2023-03-26 20:58:55 +02:00
Burkhart Wolff bbb4b1749c restructured ontology; added a family of new macros for support 2023-03-26 20:57:58 +02:00
Burkhart Wolff 4ba0c705b4 deactivated CENELEC in tests (nothing tested, just time consumed) 2023-03-26 20:56:54 +02:00
Burkhart Wolff 5d89bcc86a added some demonstrations/tests 2023-03-25 10:49:50 +01:00
Burkhart Wolff 07527dbe11 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-03-24 17:22:49 +01:00
Burkhart Wolff c0dc60d49e Enlarged Free-form Section 2023-03-24 17:22:44 +01:00
Burkhart Wolff 81a50c6a9e Reactivating failing assertions 2023-03-24 17:21:36 +01:00
Burkhart Wolff 5628eaa2dc Code Cleanup 2023-03-24 17:20:45 +01:00
Nicolas Méric 230247de1a Update Manual and code
- Update term context section
- Add option to define a default class for declare_reference*
- Use defined symbol identifiers \<quote> and \<doublequote>
  to simplify caveat section about lexical conventions
- Rename Manual theories to avoid issues
  when using Syntax.parse_term, which is not compatible
  with long names starting with a number or an underscore
- Rewrite names used as mixfix annotations
  for the term-antiquotations to rule out
  symbols excluded from mixfix form
2023-03-24 17:02:24 +01:00
Burkhart Wolff 0834f938a9 code cleanup 2023-03-24 12:59:54 +01:00
Burkhart Wolff 63c2acfece improved title setup for testSuite 2023-03-24 10:41:32 +01:00
Burkhart Wolff 3a4db69184 updated Evaluation Section 2023-03-24 08:28:14 +01:00
Burkhart Wolff 3fc4688f69 updated Evaluation Section 2023-03-24 08:13:51 +01:00
Burkhart Wolff 7dbd016b5d Pass through evaluations 2023-03-24 08:08:55 +01:00
Burkhart Wolff 3b446c874d Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-03-21 14:33:25 +01:00
Burkhart Wolff 4de23de5ee ... 2023-03-21 14:33:21 +01:00
Nicolas Méric 4bd31be71d Remove obsolete termrepr term anti-quotation
- Also some clean-up
2023-03-20 16:50:23 +01:00
Nicolas Méric 826fc489b7 Fix wrong getters and mappings naming 2023-03-17 21:09:28 +01:00
Nicolas Méric ddcbf76353 Factorize ML invariants namespaces 2023-03-17 19:10:45 +01:00
Nicolas Méric 5ad6c0d328 Add getters and mappings for name-spaced objects 2023-03-17 14:05:05 +01:00
Nicolas Méric 34d5a194ee Some clean-up 2023-03-16 16:31:19 +01:00
Nicolas Méric 8b09b0c135 Some clean-up 2023-03-16 16:05:46 +01:00
Achim D. Brucker 5292154687 Converted é to \'e to work around the lack of first-class unicode support. 2023-03-15 12:15:08 +00:00
Achim D. Brucker caf966e3df Cleanup. 2023-03-15 11:22:36 +00:00
Achim D. Brucker 6a1343fd06 Spell checking. 2023-03-15 10:52:23 +00:00
Achim D. Brucker a7db5cc344 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-03-15 10:49:02 +00:00
Nicolas Méric de94ef196f Process input_term only when object_value_debug is enabled 2023-03-15 11:37:38 +01:00
Nicolas Méric c791be2912 Add monitor tests
- Add tests for monitors spanning two theories.
- Fix monitor trace update bug.
  When we update a monitor trace while defining a new instance,
  the monitor instance is already defined.
  But we cannot update the instance using the update_instance function,
  because this function needs a binding, i.e. a short name:
  it would then update or define a new instance when we want
  to update a monitor in a super theory whose name is the same as
  a monitor defined in the current theory.
  Example:

  in the super theory:

  doc_class monitor_M =
  tmM :: int
  rejects "test_monitor_A"
  accepts "test_monitor_head ~~ test_monitor_B ~~ test_monitor_C"

  open_monitor*[test_monitor_M::monitor_M]

  in the current theory:

  doc_class monitor_M =
  tmM :: int
  rejects "test_monitor_B"
  accepts "test_monitor_E ~~ test_monitor_C"

  text*[test_monitor_head2::Concept_MonitorTest1.test_monitor_head]‹›
  open_monitor*[test_monitor_M3::monitor_M]
  ...
  ==> ERROR : the instantiation of test_monitor_head2
              will define a new instance current.test_monitor_M3
              when updating the trace of super.test_monitor_M3

  Hence we use the update_instance_entry function
  which uses long names and only updates the entry.
2023-03-15 11:02:18 +01:00
Achim D. Brucker 44528e887d Documented limitation on using Isabelle/DOF via 'sideloading' partial sessions. 2023-03-14 23:23:23 +00:00
Achim D. Brucker b3097eaa79 Merge and upgrade to development version of Isabelle/HOL. 2023-03-13 15:19:06 +00:00
Achim D. Brucker ecb1e88b78 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-03-13 13:01:07 +00:00
Achim D. Brucker 75b39bc168 Build document build engine also for main Isabelle/DOF component. 2023-03-13 13:00:49 +00:00
Nicolas Méric dde865520a Disable invariants checking for declare_reference* without meta args 2023-03-13 11:31:48 +01:00
Nicolas Méric 37afd975b3 Fix thm and file anti-quotations short name bug 2023-03-13 10:27:31 +01:00
Burkhart Wolff d2a1808fa8 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-03-08 12:08:37 +01:00
Burkhart Wolff 94543a86e4 added value-assert to TestKit, improved Concept_TermAntiquotations. Still TODOs. 2023-03-08 12:08:33 +01:00
Burkhart Wolff af096e56fc value-assert-error added 2023-03-08 08:50:19 +01:00
Burkhart Wolff 68c1046918 Code simplification 2023-03-08 08:17:08 +01:00
Achim D. Brucker 1229db1432 Ensure that output is written within session directory. 2023-03-06 23:23:23 +01:00
Nicolas Méric 3670d30ddf Fix declarations in traces bug 2023-03-06 17:47:44 +01:00
Burkhart Wolff 542c38a89c started revision 2023-03-06 17:13:27 +01:00
Nicolas Méric b96302f676 Add latex commands to print value_ and term_ 2023-03-06 17:12:32 +01:00
Burkhart Wolff f60aebccb3 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-03-06 16:54:14 +01:00
Burkhart Wolff 224a320165 ... 2023-03-06 16:53:57 +01:00
Nicolas Méric 92e7ee017a Fix display option 2023-03-06 16:14:23 +01:00
Burkhart Wolff 8e4ac3f118 corrected bugs. 2023-03-06 15:08:08 +01:00
Burkhart Wolff 9fae991ea0 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-03-06 14:00:34 +01:00
Burkhart Wolff 6e5fa2d91b added tests with references from and to terms and code 2023-03-06 14:00:29 +01:00
Nicolas Méric b1a0d5d739 Fix non unchecked text class anti-quotation 2023-03-06 13:10:20 +01:00
Nicolas Méric 10b90c823f Fix declare_reference behavior
- Fix "unchecked" text onto_class antiquotation option
- Update text-assert-error function to make meta-arguments optional
2023-03-06 12:20:58 +01:00
Nicolas Méric ef8ffda414 Refactor ML invariants checking 2023-03-06 08:46:41 +01:00
Achim D. Brucker 69485fd497 Added hint on how to build the session Isabelle_DOF-Proofs. 2023-03-05 23:18:47 +00:00
Achim D. Brucker f29d888068 Markdown cleanup. 2023-03-05 23:18:22 +00:00
Achim D. Brucker cc805cadbe Merged updates from main and ported them to Isabelle's development version. 2023-03-05 10:29:16 +00:00
Achim D. Brucker 5bf0b00fbc Fixed string comparison for /bin/sh. 2023-03-04 19:17:05 +00:00
Achim D. Brucker cc3e6566ca Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-03-04 14:51:56 +00:00
Achim D. Brucker c297b5cddd Make quick_and_dirty mode fail builds. 2023-03-04 14:51:32 +00:00
Achim D. Brucker 47c6ce78be Enabling build of Isabelle_DOF-Proofs session. 2023-03-04 14:51:05 +00:00
Burkhart Wolff 48c6457f63 Code Cleanup. 2023-03-04 13:58:51 +01:00
Burkhart Wolff ef3eee03c9 extended testkit by declare tester, added consistency proofs for OntoMatching. 2023-03-04 13:55:32 +01:00
Burkhart Wolff 853158c916 Code cleanup 2023-03-04 10:12:05 +01:00
Burkhart Wolff 280feb8653 improved testKit, finished Concept_Example_Low_Level invariant 2023-03-04 09:57:14 +01:00
Nicolas Méric 709187d415 Fix ML invariants bug for monitors 2023-03-03 18:39:35 +01:00
Nicolas Méric 289d47ee56 Fix ML invariants bug
- The ML invariants were no longer being checked; fix this.
2023-03-03 17:33:46 +01:00
Achim D. Brucker 9c324fde70 Qualified image URL. 2023-03-03 15:22:51 +00:00
Achim D. Brucker 22abad9026 Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-03-03 14:29:04 +00:00
Nicolas Méric 40e7285f0a Fix definition* test in Concept_OntoReferencing 2023-03-03 11:55:02 +01:00
Achim D. Brucker 3b33166f55 Added instructions for installing the AFP. 2023-03-03 05:47:11 +00:00
Burkhart Wolff 0f3beb846e Further advances in a more serious test setup 2023-03-02 18:13:15 +01:00
Nicolas Méric 8e6cb3b991 Add specification commands first draft
- Add definition* command
- Add theorem*, lemma*, corollary*, proposition* and schematic_goal*
  commands
2023-03-02 14:44:04 +01:00
Achim D. Brucker baf1d1b629 Check for sessions with quick_and_dirty mode enabled. 2023-03-02 08:43:57 +00:00
Achim D. Brucker de4c7a5168 Added warning mode. 2023-03-02 08:41:33 +00:00
Achim D. Brucker 6fe23c16be Removed quick_and_dirty mode. 2023-03-02 08:41:01 +00:00
Achim D. Brucker 113b3e79bf Merge. 2023-03-02 08:06:21 +00:00
Achim D. Brucker daea6333f1 Make dangling theories break the build. 2023-03-02 00:23:23 +00:00
Achim D. Brucker 53867fb24f Fixed CC example and integrated it into session hierarchy. 2023-03-02 00:23:23 +00:00
Burkhart Wolff 0f5e7f582b LaTeX repairs 2023-03-01 23:16:38 +01:00
Burkhart Wolff 0b256adee9 Bug in Test ROOT 2023-03-01 23:00:09 +01:00
Burkhart Wolff cbd197e4d8 Deeper checking in Ontological Referencing 2023-03-01 22:57:27 +01:00
Burkhart Wolff 5411aa4d6b Updating Ontological Referencing Tests 2023-03-01 22:18:48 +01:00
Burkhart Wolff 1895d3b52c Updating Ontological Referencing Tests 2023-03-01 22:17:32 +01:00
Burkhart Wolff 5bee1fee8f Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2023-03-01 20:48:04 +01:00
Burkhart Wolff a64fca4774 ground for revision of tests: TestKit, Conceptual, Latex-tests 2023-03-01 20:47:47 +01:00
Burkhart Wolff bf4c3d618e ground for revision of tests: TestKit, Conceptual, Latex-tests 2023-03-01 20:47:28 +01:00
Achim D. Brucker 684a775b07 Merge branch 'main' into Isabelle_dev 2023-03-01 11:53:53 +00:00
Achim D. Brucker 9fe7b26a35 Fixed unicode characters. 2023-03-01 11:41:31 +00:00
Nicolas Méric 511c6369dd Fix High_Level_Syntax_Invariants unit tests 2023-03-01 12:10:47 +01:00
Achim D. Brucker 2cb9156488 Integrated session for cytology example. 2023-03-01 10:49:54 +00:00
Achim D. Brucker ef87b1d81c Merge branch 'main' of git.logicalhacking.com:Isabelle_DOF/Isabelle_DOF 2023-03-01 10:46:22 +00:00
Nicolas Méric 5b7a50ba5c Fix Cytology example 2023-03-01 11:38:43 +01:00
Achim D. Brucker 69808755da Added status message after successful check. 2023-03-01 10:31:30 +00:00
Achim D. Brucker da6bc4277d Added new dependency: Metalogic_ProofChecker 2023-03-01 10:19:29 +00:00
Achim D. Brucker 229f7c49de Merge branch 'main' into Isabelle_dev 2023-03-01 09:26:16 +00:00
Achim D. Brucker 3aa1b45837 Print status. 2023-03-01 09:25:10 +00:00
Achim D. Brucker 990c6f7708 Renaming. 2023-03-01 09:24:09 +00:00
Achim D. Brucker 14dd368cd0 Removed not needed escaping. 2023-03-01 09:23:27 +00:00
Achim D. Brucker 684e1144bd Merge branch 'main' into Isabelle_dev 2023-03-01 09:20:23 +00:00
Achim D. Brucker 3a39028f1c Added CENELEC_50128_Documentation.thy to session build. 2023-03-01 09:16:48 +00:00
Achim D. Brucker ae514aea18 Print theories that are not part of session as part of the CI build. 2023-03-01 08:49:56 +00:00
Achim D. Brucker 9f5473505e Updated authorarchive. 2023-03-01 06:32:23 +00:00
Achim D. Brucker 0c732ec59f Merge branch 'main' into Isabelle_dev 2023-02-28 21:55:49 +00:00
Achim D. Brucker f27150eb88 Updated options to mark the use of the development version of Isabelle. 2023-02-28 08:34:29 +00:00
Achim D. Brucker bde86a1118 Added note on using the development version of Isabelle. 2023-02-28 08:30:56 +00:00
Achim D. Brucker be2eaab09b Merge branch 'main' into Isabelle_dev 2023-02-28 05:20:29 +00:00
Achim D. Brucker 058324ab5d Further updates to the new project structure (contributes to #23). 2023-02-28 05:20:01 +00:00
Achim D. Brucker 10b4eaf660 Fixed shebang. 2023-02-28 01:02:23 +00:00
Achim D. Brucker c59858930d Updated installation instructions and project setup for the AFP (non-Isabelle-component) version of Isabelle/DOF (contributes to #23). 2023-02-28 00:55:23 +00:00
Achim D. Brucker 7ad7c664a3 Started to update documentation to match new repository layout (contributes to #23). 2023-02-28 00:50:23 +00:00
Achim D. Brucker dd963a7e09 Re-activated build of release archive (fixed #27). 2023-02-27 15:35:52 +00:00
Achim D. Brucker 5f88def3be Fixed list_ontologies. 2023-02-27 13:34:31 +00:00
Achim D. Brucker dfcd00ca73 Merge branch 'main' into Isabelle_dev 2023-02-27 12:41:18 +00:00
Achim D. Brucker e26b4e662e Added description to ontology representations and document templates. 2023-02-27 12:24:23 +00:00
Achim D. Brucker 02332e8608 Re-activated test for dof_mkroot. 2023-02-27 09:05:34 +00:00
Achim D. Brucker 86152c374b Initial implementation of list_templates and list_ontologies (fixes #28). 2023-02-27 08:39:53 +00:00
Achim D. Brucker 233079ef5f Fixed scala build. 2023-02-26 21:55:29 +00:00
Achim D. Brucker 8389d9ddbe Merge branch 'main' into Isabelle_dev 2023-02-26 21:29:27 +00:00
Achim D. Brucker 85e6cd0372 Re-introduced dof_mkroot for main component and moved component setup to main directory (fixes #20). 2023-02-26 21:18:40 +00:00
Achim D. Brucker 9090772a8a Cleanup. 2023-02-26 11:00:57 +00:00
Achim D. Brucker 070bd363ca Merge branch 'main' into Isabelle_dev 2023-02-25 11:08:17 +00:00
Achim D. Brucker 8e65263093 Ignore generated latex-outputs in test session. 2023-02-25 11:01:58 +00:00
Achim D. Brucker acb82477b5 Moved currently unsupported document templates to the Isabelle_DOF-Ontologies session. 2023-02-25 11:01:39 +00:00
Achim D. Brucker b90992121e Updated README to reflect latest repository layout. 2023-02-25 10:28:51 +00:00
Nicolas Méric 6a6259bf29 Add very deep interpretation
Use metalogic to generate meta term anti-quotations

The idea is for the Very_Deep_Interpretation
to source the shallow material,
and then update the checking and elaboration functions
of the term anti-quotations.
To achieve this, after the metalogic is sourced,
the notations (mixfixes) of the term antiquotations
are removed and re-added.

Example:

With shallow:

datatype "typ" = Isabelle_DOF_typ string  ("@{typ _}")

This generates a datatype whose constructor Isabelle_DOF_typ has
the notation @{typ ...}.

You get:

find_consts name:"Isabelle_DOF_typ"

find_consts
  name: "Isabelle_DOF_typ"

found 1 constant(s):
  Shallow_Interpretation.typ.Isabelle_DOF_typ :: "char list ⇒ typ"

With Deep:

no_notation "Isabelle_DOF_typ" ("@{typ _}")

consts Isabelle_DOF_typ :: "string ⇒ typ" ("@{typ _}")

The notation is removed and then added to the new Isabelle_DOF_typ constant.

You get:

find_consts name:"Isabelle_DOF_typ"

find_consts
  name: "Isabelle_DOF_typ"

found 2 constant(s):
  Deep_Interpretation.Isabelle_DOF_typ :: "char list ⇒ Core.typ"
  Shallow_Interpretation.typ.Isabelle_DOF_typ :: "char list ⇒ Shallow_Interpretation.typ"

But only the Deep_Interpretation constant has the notation (mixfix).

The new interpretation of the term antiquotations is then
available to the user.
2023-02-24 10:44:47 +01:00
Achim D. Brucker fb049946c5 Fixed import. 2023-02-24 09:20:57 +00:00
Achim D. Brucker 829915ae2c Merge branch 'main' into Isabelle_dev 2023-02-22 22:58:47 +00:00
Achim D. Brucker 85f115196b Changed theory dependencies, allowing retirement of use_ontology_unchecked (fixes #25). 2023-02-22 22:46:25 +00:00
Achim D. Brucker 873f5c79ab API update to match development version of Isabelle. 2023-02-22 11:05:05 +00:00
Achim D. Brucker 55f377da39 Merge branch 'main' into Isabelle_dev 2023-02-22 10:33:38 +00:00
Achim D. Brucker 501ea118c2 Removed quick_and_dirty mode. 2023-02-22 10:13:27 +00:00
Achim D. Brucker a055180b72 Added PDF document generation (Fixes: #22). 2023-02-22 09:52:05 +00:00
Achim D. Brucker d1c195db26 Cleanup. 2023-02-22 07:20:30 +00:00
Achim D. Brucker 2481603ce1 Temporarily disabled release creation. 2023-02-22 06:52:12 +00:00
Achim D. Brucker b9eeb9e9b8 Temporarily disabled release creation. 2023-02-22 06:30:47 +00:00
Achim D. Brucker fa27d2425e Retired dof_mkroot. 2023-02-21 23:03:12 +00:00
Achim D. Brucker 8b9c65f6ef Merge branch 'main' into Isabelle_dev 2023-02-21 22:45:07 +00:00
Achim D. Brucker f66b6187f8 Introduced use_ontology_unchecked (for internal use only). 2023-02-21 22:34:30 +00:00
Achim D. Brucker cf386892fc Implemented support for using fully-qualified names for ontologies, allowing for user-defined ontology styles in custom sessions. 2023-02-21 21:32:23 +00:00
Achim D. Brucker b0879e98fd Merge branch 'main' into Isabelle_dev 2023-02-21 08:34:41 +00:00
Achim D. Brucker f8399e0fb2 Exclude proof session from default build. 2023-02-21 08:30:07 +00:00
Achim D. Brucker 0c064b1c8a Update. 2023-02-21 08:30:02 +00:00
Achim D. Brucker 1e0eeea6f9 Update. 2023-02-21 08:18:05 +00:00
Achim D. Brucker 080d867587 Exclude proof session from default build. 2023-02-21 08:17:18 +00:00
Achim D. Brucker 3e41871b17 Added bib file. 2023-02-21 08:11:35 +00:00
Achim D. Brucker be9ef5a122 Update. 2023-02-21 08:01:43 +00:00
Achim D. Brucker f0fac41148 Merge branch 'main' into Isabelle_dev 2023-02-21 07:57:33 +00:00
Achim D. Brucker 47fa3590aa Moved CENELEC ontology (and its LaTeX style) to the session Isabelle_DOF-Ontologies. 2023-02-20 23:34:54 +00:00
Achim D. Brucker fba9ca78e9 Restructured examples. 2023-02-19 22:40:11 +00:00
Achim D. Brucker 9287891483 Merge branch 'main' into Isabelle_dev 2023-02-19 22:32:24 +00:00
Achim D. Brucker 30eb47d80c Fixed section structure. 2023-02-19 22:26:18 +00:00
Achim D. Brucker 00eff9f819 Initial document setup. 2023-02-19 22:15:37 +00:00
Achim D. Brucker 73e3cb1098 Marked session as AFP candidate. 2023-02-19 20:57:06 +00:00
Achim D. Brucker 64f4957679 Merge branch 'main' into Isabelle_dev 2023-02-19 20:53:19 +00:00
Achim D. Brucker e4a8ad4227 Exclude proof session from default build. 2023-02-19 20:51:28 +00:00
Achim D. Brucker 60b1c4f4d4 Update to Isabelle development version. 2023-02-19 20:08:18 +00:00
Achim D. Brucker de1870fbee Update to Isabelle development version. 2023-02-19 20:01:01 +00:00
Achim D. Brucker f7b4cf67f7 Cleanup. 2023-02-19 18:48:37 +00:00
Achim D. Brucker 97bf5aa1e3 Fine tuning. 2023-02-19 18:20:26 +00:00
Achim D. Brucker d766ac22df Initial commit. 2023-02-19 18:12:14 +00:00
Achim D. Brucker ba90433700 Removed links to files outside of the current session. 2023-02-19 17:46:16 +00:00
Achim D. Brucker 762225d20d Added check for file references to different sessions. 2023-02-19 17:28:47 +00:00
Achim D. Brucker aaeb793a51 Moved ontologies into session Isabelle_DOF-Ontologies. 2023-02-19 16:41:16 +00:00
Achim D. Brucker 38628c37dc Integrated manual into Isabelle/DOF session. 2023-02-19 15:49:07 +00:00
Achim D. Brucker 43ccaf43f7 Refactoring of session setup. 2023-02-19 13:06:00 +00:00
Nicolas Méric 848ce311e2 Re-add name field to onto_class
To keep the abstract syntax information
of the onto_class name, re-add it as a field
of the onto_class structure
2023-02-17 12:56:45 +01:00
Nicolas Méric 6115f0de4a Some cleanup 2023-02-17 11:35:51 +01:00
Nicolas Méric bdfea3ddb1 Some cleanup 2023-02-17 09:08:34 +01:00
Nicolas Méric 9de18b148a Remove some instance and onto_class datatype entries
The id field in the instance datatype entry,
and the name, id, and thy_name fields in the onto_class datatype entry,
are now redundant, as this information is given by the name space.
Remove them
2023-02-16 10:41:04 +01:00
Nicolas Méric 1459b8cfc3 Use name space markup for onto_class entries reporting 2023-02-16 10:07:56 +01:00
Nicolas Méric 234ff18ec0 Use a name space for Onto Classes
- Use a name space table to store ontological class objects
- Remove docclass_tab table and accesses
2023-02-15 17:49:29 +01:00
Nicolas Méric 55690bba33 Homogenize instance getters names 2023-02-14 09:21:11 +01:00
Nicolas Méric 93509ab17d Update file to match the new name space implementation 2023-02-14 09:20:21 +01:00
Nicolas Méric 1e09598d81 Fix typo 2023-02-14 09:20:21 +01:00
Nicolas Méric e01ec9fc21 Use a name space for ML invariants
- Use a name space table to store ML invariant objects
- Remove docclass_inv_tab, docclass_eager_inv_tab,
  and docclass_lazy_inv_tab tables and accesses
2023-02-14 09:20:13 +01:00
Nicolas Méric 7c16d02979 Use a name space for Isabelle_DOF transformers
- Use a name space table to store Isabelle_DOF transformers objects
- Remove ISA_transformer_tab table and accesses
2023-02-12 16:49:53 +01:00
Nicolas Méric 4a77347e40 Simplify reporting of monitors 2023-02-12 11:20:13 +01:00
Nicolas Méric 2398fc579a Use name space markup for instances entries reporting
- Name spaces offer the possibility to attach reporting
  by embedding entry positions. Use this possibility
  for instance (docitem) reporting
- The position and theory entries in an Instance record are now
  redundant, as this information is given by the name space.
  Remove them
2023-02-11 22:48:11 +01:00
Nicolas Méric 821eefb230 Fix some markups 2023-02-10 15:23:23 +01:00
Nicolas Méric 9b51844fad Use a name space for monitors infos
- Use a name space table to store monitor info objects
- Remove monitor_tab table, as monitor infos were moved
  to the name space table
- It offers the possibility to define scoped versions
  of monitors
2023-02-10 13:07:17 +01:00
Nicolas Méric c440f9628f Fix typo 2023-02-09 16:40:05 +01:00
Nicolas Méric 5b3086bbe5 Use a name space for docitems (instances)
- Use a name space table to store docitem (instance) objects
- Remove docobj table, as instances were moved to the name space table
- It offers the possibility to define scoped versions
  of docitem declarations
  for text* (and other docitem-defining commands like value*)
  and declare_reference*.
2023-02-09 16:07:16 +01:00
Nicolas Méric 7c0d2cee55 Add docitem_name text and ML antiquotations
Add the possibility to reference the name of instances
in text and ML code
2023-01-30 07:43:44 +01:00
Nicolas Méric 7c6150affa Make input_term available with theory option
The raw value term of docitems is now processed and
available when setting the theory attribute object_value_debug
2023-01-27 15:09:34 +01:00
Nicolas Méric ad4ad52b4e Avoid reporting duplication when possible
Avoid reporting for the meta-argument attributes of Isabelle_DOF
commands and for the text input of text*

The last unresolved reporting duplication comes
from the document_command command in Isa_DOF,
which parses the meta arguments twice:
once for the creation of the docitem
with create_and_check_docitem, which adds reporting
for the attribute values
(see conv_attrs, which calls Syntax.read_term_global,
which in turn adds reporting),
and once for the document output
with document_output, which also adds reporting
(see meta_args_2_latex, which calls
(Syntax.check_term ctxt o Syntax.parse_term ctxt) with
ltx_of_markup; Syntax.parse_term also adds reporting)
2023-01-27 10:32:38 +01:00
Nicolas Méric ba8227e6ab Cleanup and add position to docitem ML antiquotation 2023-01-26 09:43:51 +01:00
Nicolas Méric 20b0af740d Update meta args syntax and ML* command
- Make optional meta arguments completely optional
- Make meta arguments context of ML* available in its ML context
- Make meta arguments of ML* mandatory to mimic text*.
  Without meta arguments, its behavior is already captured by
  the ML command
2023-01-23 09:03:59 +01:00
Nicolas Méric 1379f8a671 Add test of invariants of an inherited attribute of an attribute 2023-01-20 09:41:19 +01:00
Achim D. Brucker 8fdaafa295 Experimental session with enabled proof objects: Isabelle_DOF-Proofs. 2023-01-19 22:00:53 +00:00
Nicolas Méric 8513f7d267 Update doc_class rails to match accepts clause 2023-01-17 09:01:55 +01:00
Nicolas Méric 2b1a9d009e Add support for invariants on attributes of attributes
Support invariants on attributes of class attributes.

Example:

doc_class inv_test1 =
  a :: int

doc_class inv_test2 =
  b :: "inv_test1"
  c:: int
  invariant inv_test2 :: "c σ = 1"
  invariant inv_test2' :: "a (b σ) = 2"

doc_class inv_test3 = inv_test1 +
  b :: "inv_test1"
  c:: int
  invariant inv_test3 :: "a σ = 1"
  invariant inv_test3' :: "a (b σ) = 2"

To support invariants on attributes of attributes
and invariants on attributes of the superclasses,
we check that the type of the attribute of the subclass is ground:›
ML‹
val Type(st, [ty]) = \<^typ>‹inv_test1›
val Type(st', [ty']) = \<^typ>‹'a inv_test1_scheme›
val t = ty = \<^typ>‹unit›
›
2023-01-13 08:27:26 +01:00
Nicolas Méric cd758d2c44 Update accepts clause syntax 2023-01-12 12:18:58 +01:00
Nicolas Méric 8496963fec Add comment for term_ and value_ ML antiquotations 2023-01-11 14:49:29 +01:00
Nicolas Méric 72d8000f7b Further explain evaluator option syntax for value_ text antiquotation 2023-01-09 15:34:59 +01:00
Nicolas Méric 17ec11b297 Explain evaluator option syntax for value_ text antiquotation 2023-01-09 15:13:23 +01:00
Nicolas Méric a96e17abf3 Add term_ and value_ ML antiquotations 2023-01-09 11:34:40 +01:00
Nicolas Méric 74b60e47d5 Document term_ and value_ text antiquotations 2022-12-22 16:50:53 +01:00
Nicolas Méric a42dd4ea6c Implement term_ and value_ text antiquotations 2022-12-22 10:55:03 +01:00
Nicolas Méric b162a24749 Comment out hack for Assumption in scholarly_paper 2022-12-22 09:55:46 +01:00
Nicolas Méric a9432c7b52 Add a theory attribute to disable invariants checking 2022-12-22 07:53:42 +01:00
Nicolas Méric 9f28d4949e Limit scope of free class checking in examples 2022-12-22 07:32:37 +01:00
Nicolas Méric 885c23a138 Explain lazy and eager invariants 2022-12-22 07:14:29 +01:00
Nicolas Méric a589d4cd47 Update the position of the default class
The default class must stay abstract and as such
cannot have a position.
Set its position to Position.none
2022-12-21 18:32:07 +01:00
Burkhart Wolff e1f143d151 Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2022-12-21 11:35:05 +01:00
Burkhart Wolff fd60cf2312 attempt to add category 'assumption' 2022-12-21 11:34:34 +01:00
Nicolas Méric 73dfcd6c1e Implement rejects clause
- The current implementation triggers a warning when
  rejected classes are found in the monitor,
  and an error if monitor_strict_checking is enabled.
  It follows these rules:
  Inside the scope of a monitor,
  all instances of classes mentioned in its accepts_clause
  (the ∗‹accept-set›) have to appear in the order specified
  by the regular expression.
  Instances not covered by an accept-set may freely occur.
  Monitors may additionally contain a rejects_clause
  with a list of class-ids (the reject-list).
  This allows specifying ranges of
  admissible instances along the class hierarchy:
  - a superclass in the reject-list and a subclass in the
    accept-expression forbids instances superior to the subclass, and
  - a subclass S in the reject-list and a superclass T in the
    accept-list allows instances of superclasses of T to occur freely,
    instances of T to occur in the specified order and forbids
    instances of S.
- No message is triggered for the free classes,
  but two theory options, free_class_in_monitor_checking
  and free_class_in_monitor_strict_checking,
  are added and can be used to trigger warnings or errors
  in case free classes are not wanted inside a monitor.
- Fix the spurious checking warning when defining a monitor:
  the monitor was added to the monitor table before its own
  instance was added to the theory, so a monitor
  erroneously checked itself.
2022-12-21 10:09:17 +01:00
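The accept/reject mechanism described in the commit above can be sketched as an Isabelle/DOF monitor class. This is an illustrative sketch only: the class and attribute names are invented, and the clause syntax is approximated from the commit message rather than taken verbatim from the Isabelle/DOF sources:

```
doc_class report_monitor =        (* hypothetical monitor class *)
  ok :: unit
  (* instances of the accepted classes must occur in this order;
     classes in neither list may occur freely *)
  accepts "⦃introduction⦄⁺ ~~ ⦃technical⦄⁺ ~~ conclusion"
  (* instances of this class are forbidden inside the monitor's scope *)
  rejects annex
```

With free_class_in_monitor_checking enabled, occurrences of classes outside both lists would additionally produce warnings.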
Nicolas Méric c0afe1105e Enable high-level invariants checking everywhere
By default invariants checking generates warnings.
If invariants_strict_checking theory option is enabled,
the checking generates errors.

- Update 2018-cicm-isabelle_dof-applications/IsaDofApplications.thy
  and 2020-iFM-CSP/paper.thy to pass the checking of
  the low level invariant checking function "check"
  in scholarly_paper.thy,
  which checks that instances of the same class in a sequence
  have increasing levels.
  For a sequence:
  section*[intro::introduction]‹ Introduction ›
  text*[introtext::introduction, level = "Some 1"]‹...›

  introtext must have a level >= that of intro.

- Bypass the checking of high-level invariants
  when the class default_cid = "text",
  the top (default) document class.
  We want the class default_cid to stay abstract
  and not have the capability to be defined with attribute,
  invariants, etc.
  Hence this bypass handles docitems without an associated class,
  for example when you just want a document element to be referenceable
  without the burden of ontology classes.
  ex: text*[sdf]\<open> Lorem ipsum @{thm refl}\<close>

  The functions get_doc_class_global and get_doc_class_local trigger
  an error when the class is "text" (default_cid),
  so functions like check_invariants that use them would fail
  if the checking were enabled by default for all theories.
2022-12-20 16:31:09 +01:00
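The option workflow described above can be sketched in a theory as follows. The option invariants_strict_checking is the one named in the commit message, and the declare syntax mirrors the strict_monitor_checking declaration appearing later in this diff; the class and invariant themselves are illustrative:

```
(* by default, a violated invariant only produces a warning;
   this escalates violations to errors: *)
declare[[invariants_strict_checking = true]]

doc_class inv_demo =              (* hypothetical class *)
  a :: int
  invariant a_pos :: "a σ ≥ 0"    (* checked on each instance *)

text*[d1::inv_demo, a = "-1"]‹...›  (* would now be rejected *)
```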
Burkhart Wolff e414b97afb rephrasing invariant for core scholarly_paper classes 2022-12-19 12:14:30 +01:00
Nicolas Méric 0b2d28b547 Update error message for invariant checking 2022-12-09 16:11:57 +01:00
Nicolas Méric 37d7ed7d17 Update rails for annotated text element in manual 2022-12-09 15:13:22 +01:00
Nicolas Méric 312734afbd Update Attributes examples 2022-12-09 15:12:38 +01:00
Burkhart Wolff 8cee80d78e advanced example on trace-attribute term-antiquotations 2022-12-07 16:01:38 +01:00
Makarius Wenzel ec0d525426 Tuned messages, following Isabelle/d6a2a8bc40e1 2022-12-05 15:21:26 +01:00
Makarius Wenzel 791990039b Tuned messages and options, following Isabelle/c7f3e94fce7b 2022-12-05 12:37:59 +01:00
Makarius Wenzel 78d61390fe Prefer Isar command, instead of its underlying ML implementation 2022-12-05 11:50:12 +01:00
Makarius Wenzel ffcf1f3240 Add missing file (amending 5471d873a9) 2022-12-04 19:26:28 +01:00
Makarius Wenzel 5471d873a9 Isabelle/Scala module within session context supports document_build = "dof" without component setup 2022-12-04 19:13:08 +01:00
Makarius Wenzel df37250a00 Simplified args, following README.md 2022-12-04 19:00:23 +01:00
Makarius Wenzel 185daeb577 Tuned 2022-12-04 18:25:29 +01:00
Makarius Wenzel 8037fd15f2 Tuned messages, following isabelle.Export.message 2022-12-04 18:20:54 +01:00
Makarius Wenzel afcd78610b More concise export artifact 2022-12-04 18:03:53 +01:00
Makarius Wenzel b8a9ef5118 Tuned comments 2022-12-04 16:38:56 +01:00
Makarius Wenzel a4e75c8b12 Clarified export name for the sake of low-level errors 2022-12-04 16:35:55 +01:00
Makarius Wenzel d20e9ccd22 Proper session qualifier for theory imports (amending 44cae2e631) 2022-12-04 00:45:07 +01:00
Makarius Wenzel f2ee5d3780 Tuned 2022-12-04 00:10:43 +01:00
Makarius Wenzel 44cae2e631 More formal management of ontologies in Isabelle/ML/Isar with output via Isabelle/Scala exports 2022-12-04 00:09:29 +01:00
Makarius Wenzel 7b2bf35353 More strict treatment of document export artifacts 2022-12-03 14:54:14 +01:00
Makarius Wenzel e8c7fa6018 Clarified signature 2022-12-03 14:44:04 +01:00
Makarius Wenzel b12e61511d Discourage etc/options 2022-12-03 13:55:56 +01:00
Makarius Wenzel 3cac42e6cb Clarified order 2022-12-03 12:39:00 +01:00
Makarius Wenzel aee8ba1df1 Prefer DOF parameters over Isabelle options 2022-12-03 12:37:58 +01:00
Makarius Wenzel d93e1383d4 Afford full-scale command-line tool 2022-12-03 12:29:24 +01:00
Makarius Wenzel 3d5d1e7476 Further attempts at woodpecker environment 2022-12-02 22:54:02 +01:00
Makarius Wenzel 4264e7cd15 Build Scala/Java components to get proper ISABELLE_CLASSPATH 2022-12-02 21:40:59 +01:00
Makarius Wenzel 96f4077c53 Tuned message 2022-12-02 21:29:45 +01:00
Makarius Wenzel d7fb39d7eb Adhoc command-line tool replaces old options 2022-12-02 21:14:55 +01:00
Makarius Wenzel b95826962f Tuned documentation 2022-12-02 20:29:40 +01:00
Makarius Wenzel 912d4bb49e Maintain document template in Isabelle/ML via Isar commands:
result becomes export artifact, which is harvested by Isabelle/Scala build engine
2022-12-02 20:05:15 +01:00
Makarius Wenzel a6c1a2baa4 Removed obsolete "extend" operation 2022-12-02 15:31:23 +01:00
Makarius Wenzel bb5963c6e2 Proper usage of dof_mkroot, although its Bash pretty-printing in LaTeX is a bit odd 2022-12-02 14:35:17 +01:00
Makarius Wenzel cc3e2a51a4 More antiquotations 2022-12-02 13:50:16 +01:00
Makarius Wenzel 9e4e5b49eb More antiquotations from Isabelle2021-1/2022 2022-12-02 11:41:31 +01:00
Makarius Wenzel b65ecbdbef Updated to Isabelle2022 2022-12-02 10:34:15 +01:00
Makarius Wenzel 3be2225dcf Tuned comments 2022-12-01 22:54:01 +01:00
Makarius Wenzel f44f0af01c Use regular Toplevel.presentation from Isabelle2022, without alternative presentation hook 2022-12-01 22:48:45 +01:00
Makarius Wenzel 9a11baf840 Latex.output_name name is back in Isabelle2022 2022-12-01 22:04:56 +01:00
Makarius Wenzel 48c167aa23 Proper DOF.artifact_url 2022-12-01 21:45:06 +01:00
Makarius Wenzel 700a9bbfee clarified DOF.options: hard-wired document_comment_latex always uses LaTeX version of comment.sty 2022-12-01 21:30:32 +01:00
Makarius Wenzel 73299941ad Tuned 2022-12-01 17:26:29 +01:00
Makarius Wenzel 5a8c438c41 Omit excessive quotes 2022-12-01 16:48:33 +01:00
Makarius Wenzel 7772c73aaa More accurate defaults 2022-12-01 16:39:41 +01:00
Makarius Wenzel ca18453043 Clarified signature: more explicit types and operations 2022-12-01 16:28:44 +01:00
Makarius Wenzel 1a122b1a87 More robust default 2022-12-01 15:48:52 +01:00
Makarius Wenzel 47d95c467e Tuned whitespace 2022-12-01 15:33:16 +01:00
Makarius Wenzel bf3085d4c0 Clarified defaults and command-line options 2022-12-01 15:26:48 +01:00
Makarius Wenzel 068e6e0411 Tuned 2022-12-01 14:23:00 +01:00
Makarius Wenzel 09e9980691 Tuned 2022-12-01 14:22:32 +01:00
Makarius Wenzel 94ce3fdec2 Prefer constants in Scala, to make this independent from component context 2022-12-01 14:15:17 +01:00
Makarius Wenzel 44819bff02 Updated message, following c29ec9641a 2022-12-01 12:44:03 +01:00
Makarius Wenzel a6ab1e101e Update Isabelle + AFP URLs 2022-12-01 11:55:51 +01:00
Makarius Wenzel c29ec9641a Simplified installation 2022-12-01 11:45:12 +01:00
Nicolas Méric 06833aa190 Update single-argument handling for compute_attr_access
Trigger an error when the attribute is not specified as an argument
of the antiquotation and is not an attribute of the instance.
(In this case, the position of the attribute is NONE)
2022-11-28 10:05:47 +01:00
Nicolas Méric 4f0c7e1e95 Fix type unification clash for trace_attribute term antiquotation 2022-11-25 08:57:59 +01:00
Nicolas Méric 0040949cf8 Add trace-attribute term antiquotation
- Make doc_class type and constant used by regular expression
  in monitors ground
- Make class tag attribute ground (with serial())
- The previous items make possible
  the evaluation of the trace attribute
  and the definition of the trace-attribute term antiquotation
2022-11-24 16:47:21 +01:00
Nicolas Méric e68c332912 Fix markup for some antiquotations
Fix markup for docitem_attribute and trace_attribute
ML and text antiquotations
2022-11-24 11:22:02 +01:00
Burkhart Wolff b2c4f40161 Some LaTeX experiments with Achim 2022-11-18 10:30:33 +01:00
Burkhart Wolff 309952e0ce syntactic rearrangements 2022-11-09 11:19:00 +01:00
Burkhart Wolff 830e1b440a ported another Figure* in OutOfOrderPresntn to Isabelle2022 2022-11-09 06:06:30 +01:00
Burkhart Wolff 2149db9efc semantics of fig_content (untested) 2022-11-08 20:52:58 +01:00
Burkhart Wolff 1547ace64b added some semantics to fig_content 2022-11-08 19:27:07 +01:00
Burkhart Wolff 39acd61dfd Merge branch 'main' of https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF 2022-11-08 10:03:30 +01:00
Burkhart Wolff 29770b17ee added syntax for fig_content 2022-11-08 10:03:15 +01:00
Achim D. Brucker b4f4048cff Made clear that more than two AFP entries are required. 2022-11-07 17:05:04 +00:00
216 changed files with 15583 additions and 8438 deletions

.gitignore vendored
View File

@ -2,3 +2,4 @@ output
.afp
*~
*#
Isabelle_DOF-Unit-Tests/latex_test/

View File

@ -1,22 +1,29 @@
pipeline:
build:
image: docker.io/logicalhacking/isabelle2022
image: git.logicalhacking.com/lh-docker/lh-docker-isabelle/isabelle2023:latest
pull: true
commands:
- hg log --limit 2 /root/isabelle
- ./.woodpecker/check_dangling_theories
- ./.woodpecker/check_external_file_refs
- ./.woodpecker/check_quick_and_dirty
- export ARTIFACT_DIR=$CI_WORKSPACE/.artifacts/$CI_REPO/$CI_BRANCH/$CI_BUILD_NUMBER/$LATEX
- mkdir -p $ARTIFACT_DIR
- export `isabelle getenv ISABELLE_HOME_USER`
- mkdir -p $ISABELLE_HOME_USER/etc
- echo "ISABELLE_PDFLATEX=\"$LATEX --file-line-error\"" >> $ISABELLE_HOME_USER/etc/settings
- isabelle components -u `pwd`
- isabelle build -D . -o browser_info
- isabelle dof_mkroot DOF_test
- isabelle build -x HOL-Proofs -x Isabelle_DOF-Proofs -D . -o browser_info
- if [ "$LATEX" = "lualatex" ]; then isabelle build -o 'timeout_scale=2' -D . -o browser_info; else echo "Skipping Isabelle_DOF-Proofs for pdflatex build."; fi
- find . -name 'root.tex' -prune -o -name 'output' -type f | xargs latexmk -$LATEX -cd -quiet -Werror
- isabelle components -u .
- isabelle dof_mkroot -q DOF_test
- isabelle build -D DOF_test
- cp -r $ISABELLE_HOME_USER/browser_info $ARTIFACT_DIR
- cd $ARTIFACT_DIR
- cd ../..
- ln -s * latest
archive:
image: docker.io/logicalhacking/isabelle2022
image: git.logicalhacking.com/lh-docker/lh-docker-isabelle/isabelle2023:latest
commands:
- export ARTIFACT_DIR=$CI_WORKSPACE/.artifacts/$CI_REPO/$CI_BRANCH/$CI_BUILD_NUMBER/$LATEX
- mkdir -p $ARTIFACT_DIR
@ -38,7 +45,7 @@ pipeline:
from_secret: artifacts_ssh
user: artifacts
notify:
image: drillster/drone-email
image: docker.io/drillster/drone-email
settings:
host: smtp.0x5f.org
username: woodpecker

View File

@ -0,0 +1,33 @@
#!/bin/bash
set -e
failuremsg="Error"
failurecode=1
while [ $# -gt 0 ]
do
case "$1" in
--warning|-w)
failuremsg="Warning"
failurecode=0;;
esac
shift
done
echo "Checking for theories that are not part of an Isabelle session:"
echo "==============================================================="
PWD=`pwd`
TMPDIR=`mktemp -d`
isabelle build -D . -l -n | grep $PWD | sed -e "s| *${PWD}/||" | sort -u | grep thy$ > ${TMPDIR}/sessions-thy-files.txt
find * -type f | sort -u | grep thy$ > ${TMPDIR}/actual-thy-files.txt
thylist=`comm -13 ${TMPDIR}/sessions-thy-files.txt ${TMPDIR}/actual-thy-files.txt`
if [ -z "$thylist" ] ; then
echo " * Success: No dangling theories found."
exit 0
else
echo -e "$thylist"
echo "$failuremsg: Dangling theories found (see list above)!"
exit $failurecode
fi

View File

@ -0,0 +1,45 @@
#!/bin/sh
failuremsg="Error"
failurecode=1
while [ $# -gt 0 ]
do
case "$1" in
--warning|-w)
failuremsg="Warning"
failurecode=0;;
esac
shift
done
DIRREGEXP="\\.\\./"
echo "Checking for references pointing outside of session directory:"
echo "=============================================================="
REGEXP=$DIRREGEXP
DIR=$DIRMATCH
failed=0
for i in $(seq 1 10); do
FILES=`find * -mindepth $((i-1)) -maxdepth $i -type f | xargs`
if [ -n "$FILES" ]; then
grep -s ${REGEXP} ${FILES}
exit=$?
if [ "$exit" -eq 0 ] ; then
failed=1
fi
fi
REGEXP="${DIRREGEXP}${REGEXP}"
done
if [ "$failed" -ne 0 ] ; then
echo "$failuremsg: Forbidden reference to files outside of their session directory!"
exit $failurecode
fi
echo " * Success: No relative references to files outside of their session directory found."
exit 0

View File

@ -0,0 +1,30 @@
#!/bin/bash
set -e
failuremsg="Error"
failurecode=1
while [ $# -gt 0 ]
do
case "$1" in
--warning|-w)
failuremsg="Warning"
failurecode=0;;
esac
shift
done
echo "Checking for sessions with quick_and_dirty mode enabled:"
echo "========================================================"
rootlist=`find -name 'ROOT' -exec grep -l 'quick_and_dirty *= *true' {} \;`
if [ -z "$rootlist" ] ; then
echo " * Success: No sessions with quick_and_dirty mode enabled found."
exit 0
else
echo -e "$rootlist"
echo "$failuremsg: Sessions with quick_and_dirty mode enabled found (see list above)!"
exit $failurecode
fi
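All three CI check scripts above share the same --warning/-w flag-parsing idiom: by default a failed check is a hard error (exit 1), and the flag downgrades it to a warning (exit 0). A standalone sketch of just that idiom; the function name parse_severity is invented for illustration:

```shell
#!/bin/sh
# Sketch of the flag-parsing idiom shared by the CI check scripts.
parse_severity() {
  failuremsg="Error"
  failurecode=1
  while [ $# -gt 0 ]
  do
    case "$1" in
      --warning|-w)
        failuremsg="Warning"
        failurecode=0;;
    esac
    shift
  done
  # a real check script would use these when reporting a failure;
  # here we just print the selected severity and exit code
  echo "$failuremsg $failurecode"
}

parse_severity            # default: hard failure
parse_severity --warning  # downgraded to a warning
```

Because the loop only inspects recognized flags and shifts everything, the scripts tolerate (and ignore) any other arguments.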

View File

@ -83,22 +83,22 @@ build_and_install_manuals()
if [ "$DIRTY" = "true" ]; then
if [ -z ${ARTIFACT_DIR+x} ]; then
echo " * Quick and Dirty Mode (local build)"
$ISABELLE build -d . Isabelle_DOF-Manual 2018-cicm-isabelle_dof-applications
mkdir -p $ISADOF_WORK_DIR/examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/
cp examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/document.pdf \
$ISADOF_WORK_DIR/examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/
mkdir -p $ISADOF_WORK_DIR/examples/technical_report/Isabelle_DOF-Manual/output/
cp examples/technical_report/Isabelle_DOF-Manual/output/document.pdf \
$ISADOF_WORK_DIR/examples/technical_report/Isabelle_DOF-Manual/output/;
$ISABELLE build -d . Isabelle_DOF Isabelle_DOF-Example-I
mkdir -p $ISADOF_WORK_DIR/Isabelle_DOF-Example-I/output/
cp Isabelle_DOF-Example-I/output/document.pdf \
$ISADOF_WORK_DIR/Isabelle_DOF-Example-I/output/
mkdir -p $ISADOF_WORK_DIR/Isabelle_DOF/output/
cp Isabelle_DOF/output/document.pdf \
$ISADOF_WORK_DIR/Isabelle_DOF/output/;
else
echo " * Quick and Dirty Mode (running on CI)"
mkdir -p $ISADOF_WORK_DIR/examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/
cp $ARTIFACT_DIR/browser_info/Unsorted/2018-cicm-isabelle_dof-applications/document.pdf \
$ISADOF_WORK_DIR/examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/
mkdir -p $ISADOF_WORK_DIR/Isabelle_DOF-Example-I/output/
cp $ARTIFACT_DIR/browser_info/AFP/Isabelle_DOF-Example-I/document.pdf \
$ISADOF_WORK_DIR/Isabelle_DOF-Example-I/output/
mkdir -p $ISADOF_WORK_DIR/examples/technical_report/Isabelle_DOF-Manual/output/
cp $ARTIFACT_DIR/browser_info/Unsorted/Isabelle_DOF-Manual/document.pdf \
$ISADOF_WORK_DIR/examples/technical_report/Isabelle_DOF-Manual/output/;
mkdir -p $ISADOF_WORK_DIR/Isabelle_DOF/output/
cp $ARTIFACT_DIR/browser_info/AFP/Isabelle_DOF/document.pdf \
$ISADOF_WORK_DIR/Isabelle_DOF/output/;
fi
else
(cd $ISADOF_WORK_DIR && $ISABELLE env ./install-afp)
@ -107,13 +107,13 @@ build_and_install_manuals()
mkdir -p $ISADOF_WORK_DIR/doc
echo "Isabelle/DOF Manuals!" > $ISADOF_WORK_DIR/doc/Contents
cp $ISADOF_WORK_DIR/examples/technical_report/Isabelle_DOF-Manual/output/document.pdf \
cp $ISADOF_WORK_DIR/Isabelle_DOF/output/document.pdf \
$ISADOF_WORK_DIR/doc/Isabelle_DOF-Manual.pdf
echo " Isabelle_DOF-Manual User and Implementation Manual for Isabelle/DOF" >> $ISADOF_WORK_DIR/doc/Contents
cp $ISADOF_WORK_DIR/examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/document.pdf \
$ISADOF_WORK_DIR/doc/2018-cicm-isabelle_dof-applications.pdf
echo " 2018-cicm-isabelle_dof-applications Example academic paper" >> $ISADOF_WORK_DIR/doc/Contents
cp $ISADOF_WORK_DIR/Isabelle_DOF-Example-I/output/document.pdf \
$ISADOF_WORK_DIR/doc/Isabelle_DOF-Example-I.pdf
echo " Isabelle_DOF-Example-I Example academic paper" >> $ISADOF_WORK_DIR/doc/Contents
find $ISADOF_WORK_DIR -type d -name "output" -exec rm -rf {} \; &> /dev/null || true
rm -rf $ISADOF_WORK_DIR/.git* $ISADOF_WORK_DIR/.woodpecker $ISADOF_WORK_DIR/.afp
@ -143,7 +143,6 @@ publish_archive()
ssh 0x5f.org chmod go+u-w -R www/$DOF_ARTIFACT_HOST/htdocs/$DOF_ARTIFACT_DIR
}
ISABELLE=`which isabelle`
USE_TAG="false"
SIGN="false"
@@ -194,8 +193,8 @@ for i in $VARS; do
export "$i"
done
ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL options -g dof_isabelle)"
DOF_VERSION="$($ISABELLE_TOOL options -g dof_version)"
ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL dof_param -b isabelle_version)"
DOF_VERSION="$($ISABELLE_TOOL dof_param -b dof_version)"
ISABELLE_SHORT_VERSION=`echo $ISABELLE_VERSION | sed -e 's/:.*$//'`
ISADOF_TAR="Isabelle_DOF-"$DOF_VERSION"_"$ISABELLE_SHORT_VERSION
@@ -221,4 +220,3 @@ fi
rm -rf $BUILD_DIR
exit 0


@@ -11,7 +11,7 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.
### Changed
- Updated Isabelle version to Isabelle 2022
- Updated Isabelle version to Isabelle 2023
## [1.3.0] - 2022-07-08


@@ -16,6 +16,9 @@ theory IsaDofApplications
imports "Isabelle_DOF.scholarly_paper"
begin
use_template "lncs"
use_ontology "Isabelle_DOF.scholarly_paper"
open_monitor*[this::article]
declare[[strict_monitor_checking=false]]
@@ -27,6 +30,61 @@ define_shortcut* isadof \<rightleftharpoons> \<open>\isadof\<close>
(* slanted text in contrast to italics *)
define_macro* slanted_text \<rightleftharpoons> \<open>\textsl{\<close> _ \<open>}\<close>
define_macro* unchecked_label \<rightleftharpoons> \<open>\autoref{\<close> _ \<open>}\<close>
ML\<open>
fun boxed_text_antiquotation name (* redefined in these more abstract terms *) =
DOF_lib.gen_text_antiquotation name DOF_lib.report_text
(fn ctxt => DOF_lib.string_2_text_antiquotation ctxt
#> DOF_lib.enclose_env false ctxt "isarbox")
val neant = K(Latex.text("",\<^here>))
fun boxed_theory_text_antiquotation name (* redefined in these more abstract terms *) =
DOF_lib.gen_text_antiquotation name DOF_lib.report_theory_text
(fn ctxt => DOF_lib.string_2_theory_text_antiquotation ctxt
#> DOF_lib.enclose_env false ctxt "isarbox"
(* #> neant *)) (*debugging *)
fun boxed_sml_text_antiquotation name =
DOF_lib.gen_text_antiquotation name (K(K()))
(fn ctxt => Input.source_content
#> Latex.text
#> DOF_lib.enclose_env true ctxt "sml")
(* the simplest conversion possible *)
fun boxed_pdf_antiquotation name =
DOF_lib.gen_text_antiquotation name (K(K()))
(fn ctxt => Input.source_content
#> Latex.text
#> DOF_lib.enclose_env true ctxt "out")
(* the simplest conversion possible *)
fun boxed_latex_antiquotation name =
DOF_lib.gen_text_antiquotation name (K(K()))
(fn ctxt => Input.source_content
#> Latex.text
#> DOF_lib.enclose_env true ctxt "ltx")
(* the simplest conversion possible *)
fun boxed_bash_antiquotation name =
DOF_lib.gen_text_antiquotation name (K(K()))
(fn ctxt => Input.source_content
#> Latex.text
#> DOF_lib.enclose_env true ctxt "bash")
(* the simplest conversion possible *)
\<close>
setup\<open>boxed_text_antiquotation \<^binding>\<open>boxed_text\<close> #>
boxed_text_antiquotation \<^binding>\<open>boxed_cartouche\<close> #>
boxed_theory_text_antiquotation \<^binding>\<open>boxed_theory_text\<close> #>
boxed_sml_text_antiquotation \<^binding>\<open>boxed_sml\<close> #>
boxed_pdf_antiquotation \<^binding>\<open>boxed_pdf\<close> #>
boxed_latex_antiquotation \<^binding>\<open>boxed_latex\<close>#>
boxed_bash_antiquotation \<^binding>\<open>boxed_bash\<close>
\<close>
(*>*)
@@ -71,7 +129,7 @@ abstract*[abs::abstract, keywordlist="[''Ontology'',''Ontological Modeling'',''I
\<close>
section*[intro::introduction]\<open> Introduction \<close>
text*[introtext::introduction]\<open>
text*[introtext::introduction, level = "Some 1"]\<open>
The linking of the \<^emph>\<open>formal\<close> to the \<^emph>\<open>informal\<close> is perhaps the
most pervasive challenge in the digitization of knowledge and its
propagation. This challenge incites numerous research efforts
@@ -99,20 +157,18 @@ document evolution. Based on Isabelle infrastructures, ontologies may refer to
types, terms, proven theorems, code, or established assertions.
Based on a novel adaption of the Isabelle IDE, a document is checked to be
\<^emph>\<open>conform\<close> to a particular ontology---\<^isadof> is designed to give fast user-feedback
\<^emph>\<open>during the capture of content\<close>. This is particularly valuable in case of document
\<^emph>\<open>during the capture of content\<close>. This is particularly valuable for document
changes, where the \<^emph>\<open>coherence\<close> between the formal and the informal parts of the
content can be mechanically checked.
To avoid any misunderstanding: \<^isadof> is \<^emph>\<open>not a theory in HOL\<close>
on ontologies and operations to track and trace links in texts,
it is an \<^emph>\<open>environment to write structured text\<close> which \<^emph>\<open>may contain\<close>
\<^isabelle> definitions and proofs like mathematical articles, tech-reports and
scientific papers---as the present one, which is written in \<^isadof>
itself. \<^isadof> is a plugin into the Isabelle/Isar
framework in the style of~@{cite "wenzel.ea:building:2007"}.
To avoid any misunderstanding: \<^isadof> is \<^emph>\<open>not a theory in HOL\<close> on ontologies and operations
to track and trace links in texts, it is an \<^emph>\<open>environment to write structured text\<close> which
\<^emph>\<open>may contain\<close> \<^isabelle> definitions and proofs like mathematical articles, tech-reports and
scientific papers---as the present one, which is written in \<^isadof> itself. \<^isadof> is a plugin
into the Isabelle/Isar framework in the style of~@{cite "wenzel.ea:building:2007"}.
\<close>
(* declaring the forward references used in the subsequent section *)
(* declaring the forward references used in the subsequent sections *)
(*<*)
declare_reference*[bgrnd::text_section]
declare_reference*[isadof::text_section]
@@ -120,29 +176,25 @@ declare_reference*[ontomod::text_section]
declare_reference*[ontopide::text_section]
declare_reference*[conclusion::text_section]
(*>*)
text*[plan::introduction]\<open> The plan of the paper is follows: we start by introducing the underlying
Isabelle system (@{text_section (unchecked) \<open>bgrnd\<close>}) followed by presenting the
essentials of \<^isadof> and its ontology language (@{text_section (unchecked) \<open>isadof\<close>}).
text*[plan::introduction, level="Some 1"]\<open> The plan of the paper is as follows: we start by
introducing the underlying Isabelle system (@{text_section (unchecked) \<open>bgrnd\<close>}) followed by
presenting the essentials of \<^isadof> and its ontology language (@{text_section (unchecked) \<open>isadof\<close>}).
It follows @{text_section (unchecked) \<open>ontomod\<close>}, where we present three application
scenarios from the point of view of the ontology modeling. In @{text_section (unchecked) \<open>ontopide\<close>}
we discuss the user-interaction generated from the ontological definitions. Finally, we draw
conclusions and discuss related work in @{text_section (unchecked) \<open>conclusion\<close>}. \<close>
section*[bgrnd::text_section,main_author="Some(@{docitem ''bu''}::author)"]
section*[bgrnd::text_section,main_author="Some(@{author ''bu''}::author)"]
\<open> Background: The Isabelle System \<close>
text*[background::introduction]\<open>
While Isabelle is widely perceived as an interactive theorem prover
for HOL (Higher-order Logic)~@{cite "nipkow.ea:isabelle:2002"}, we
would like to emphasize the view that Isabelle is far more than that:
it is the \<^emph>\<open>Eclipse of Formal Methods Tools\<close>. This refers to the
``\<^slanted_text>\<open>generic system framework of Isabelle/Isar underlying recent
versions of Isabelle. Among other things, Isar provides an
infrastructure for Isabelle plug-ins, comprising extensible state
components and extensible syntax that can be bound to ML
programs. Thus, the Isabelle/Isar architecture may be understood as
an extension and refinement of the traditional `LCF approach', with
explicit infrastructure for building derivative
\<^emph>\<open>systems\<close>.\<close>''~@{cite "wenzel.ea:building:2007"}
text*[background::introduction, level="Some 1"]\<open>
While Isabelle is widely perceived as an interactive theorem prover for HOL
(Higher-order Logic)~@{cite "nipkow.ea:isabelle:2002"}, we would like to emphasize the view that
Isabelle is far more than that: it is the \<^emph>\<open>Eclipse of Formal Methods Tools\<close>. This refers to the
``\<^slanted_text>\<open>generic system framework of Isabelle/Isar underlying recent versions of Isabelle.
Among other things, Isar provides an infrastructure for Isabelle plug-ins, comprising extensible
state components and extensible syntax that can be bound to ML programs. Thus, the Isabelle/Isar
architecture may be understood as an extension and refinement of the traditional `LCF approach',
with explicit infrastructure for building derivative \<^emph>\<open>systems\<close>.\<close>''~@{cite "wenzel.ea:building:2007"}
The current system framework offers moreover the following features:
@@ -154,12 +206,12 @@ The current system framework offers moreover the following features:
the most prominent and deeply integrated system component.
\<close>
figure*[architecture::figure,relative_width="100",src="''figures/isabelle-architecture''"]\<open>
figure*[architecture::figure,relative_width="100",file_src="''figures/isabelle-architecture.pdf''"]\<open>
The system architecture of Isabelle (left-hand side) and the
asynchronous communication between the Isabelle system and
the IDE (right-hand side). \<close>
text*[blug::introduction]\<open> The Isabelle system architecture shown in @{figure \<open>architecture\<close>}
text*[blug::introduction, level="Some 1"]\<open> The Isabelle system architecture shown in @{figure \<open>architecture\<close>}
comes with many layers, with Standard ML (SML) at the bottom layer as implementation
language. The architecture actually foresees a \<^emph>\<open>Nano-Kernel\<close> (our terminology) which
resides in the SML structure \<^ML_structure>\<open>Context\<close>. This structure provides a kind of container called
@@ -169,41 +221,39 @@ automated proof procedures as well as specific support for higher specification
were built. \<close>
text\<open> We would like to detail the documentation generation of the architecture,
which is based on literate specification commands such as \inlineisar+section+ \<^dots>,
\inlineisar+subsection+ \<^dots>, \inlineisar+text+ \<^dots>, etc.
which is based on literate specification commands such as \<^theory_text>\<open>section\<close> \<^dots>,
\<^theory_text>\<open>subsection\<close> \<^dots>, \<^theory_text>\<open>text\<close> \<^dots>, etc.
Thus, a user can add a simple text:
\begin{isar}
text\<Open>This is a description.\<Close>
\end{isar}
@{boxed_theory_text [display]\<open>
text\<open> This is a description.\<close>\<close>}
These text-commands can be arbitrarily mixed with other commands stating definitions, proofs, code, etc.,
and will result in the corresponding output in generated \<^LaTeX> or HTML documents.
Now, \<^emph>\<open>inside\<close> the textual content, it is possible to embed a \<^emph>\<open>text-antiquotation\<close>:
\begin{isar}
text\<Open>According to the reflexivity axiom \at{thm refl}, we obtain in \<Gamma>
for \at{term "fac 5"} the result \at{value "fac 5"}.\<Close>
\end{isar}
@{boxed_theory_text [display]\<open>
text\<open> According to the \<^emph>\<open>reflexivity\<close> axiom @{thm refl},
we obtain in \<Gamma> for @{term "fac 5"} the result @{value "fac 5"}.\<close>\<close>}
which is represented in the generated output by:
\begin{out}
According to the reflexivity axiom $x = x$, we obtain in $\Gamma$ for $\operatorname{fac} 5$ the result $120$.
\end{out}
where \inlineisar+refl+ is actually the reference to the axiom of reflexivity in HOL.
For the antiquotation \inlineisar+\at{value "fac 5"}+ we assume the usual definition for
\inlineisar+fac+ in HOL.
@{boxed_pdf [display]\<open>According to the reflexivity axiom $x = x$, we obtain in $\Gamma$ for $\operatorname{fac} 5$ the result $120$.\<close>}
where \<^theory_text>\<open>refl\<close> is actually the reference to the axiom of reflexivity in HOL.
For the antiquotation \<^theory_text>\<open>@{value "''fac 5''"}\<close> we assume the usual definition for
\<^theory_text>\<open>fac\<close> in HOL.
\<close>
text*[anti]\<open> Thus, antiquotations can refer to formal content, can be type-checked before being
displayed and can be used for calculations before actually being typeset. When editing,
Isabelle's PIDE offers auto-completion and error-messages while typing the above
\<^emph>\<open>semi-formal\<close> content. \<close>
text*[anti::introduction, level = "Some 1"]\<open> Thus, antiquotations can refer to formal content,
can be type-checked before being displayed and can be used for calculations before actually being
typeset. When editing, Isabelle's PIDE offers auto-completion and error-messages while typing the
above \<^emph>\<open>semi-formal\<close> content.\<close>
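text\<open> The evaluation step behind a value-antiquotation such as \<^theory_text>\<open>@{value "fac 5"}\<close> can be
pictured, outside Isabelle, as eager expression evaluation during document generation: embedded
expressions are computed and checked before any text is emitted. The following is a deliberately
simplified, hypothetical Python sketch (the template syntax and \<^theory_text>\<open>fac\<close> are our own inventions,
not Isabelle/DOF code):

```python
# Illustrative sketch only: evaluate embedded expressions before
# "typesetting", analogous to @{value "fac 5"}. All names hypothetical.
import re

def fac(n: int) -> int:
    return 1 if n <= 1 else n * fac(n - 1)

def expand(text: str, env: dict) -> str:
    # Replace each @{value EXPR} with its evaluated result; evaluation
    # failures would surface eagerly, as PIDE reports them while typing.
    def repl(match):
        return str(eval(match.group(1), env))
    return re.sub(r"@\{value ([^}]*)\}", repl, text)

print(expand("the result @{value fac(5)}.", {"fac": fac}))
```

In the real system the checking is, of course, done by Isabelle's parsers and evaluators rather
than by string substitution. \<close>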
section*[isadof::technical,main_author="Some(@{docitem ''adb''}::author)"]\<open> \<^isadof> \<close>
section*[isadof::technical,main_author="Some(@{author ''adb''}::author)"]\<open> \<^isadof> \<close>
text\<open> An \<^isadof> document consists of three components:
\<^item> the \<^emph>\<open>ontology definition\<close> which is an Isabelle theory file with definitions
for document-classes and all auxiliary datatypes.
\<^item> the \<^emph>\<open>core\<close> of the document itself which is an Isabelle theory
importing the ontology definition. \<^isadof> provides its own family of text-element
commands such as \inlineisar+title*+, \inlineisar+section*+, \inlineisar+text*+, etc.,
commands such as \<^theory_text>\<open>title*\<close>, \<^theory_text>\<open>section*\<close>, \<^theory_text>\<open>text*\<close>, etc.,
which can be annotated with meta-information defined in the underlying ontology definition.
\<^item> the \<^emph>\<open>layout definition\<close> for the given ontology exploiting this meta-information.
\<close>
@@ -212,7 +262,7 @@ three parts. Note that the document core \<^emph>\<open>may\<close>, but \<^emph
use Isabelle definitions or proofs for checking the formal content---the
present paper is actually an example of a document not containing any proof.
The document generation process of \<^isadof> is currently restricted to \LaTeX, which means
The document generation process of \<^isadof> is currently restricted to \<^LaTeX>, which means
that the layout is defined by a set of \<^LaTeX> style files. Several layout
definitions for one ontology are possible and pave the way for generating different \<^emph>\<open>views\<close> of
the same central document, addressing the needs of different purposes
@@ -226,65 +276,47 @@ style-files (\<^verbatim>\<open>.sty\<close>-files). In the document core author
their source, but this limits the possibility of using different representation technologies,
\<^eg>, HTML, and increases the risk of arcane error-messages in generated \<^LaTeX>.
The \<^isadof> ontology specification language consists basically on a notation for
document classes, where the attributes were typed with HOL-types and can be instantiated
by terms HOL-terms, \<^ie>, the actual parsers and type-checkers of the Isabelle system were reused.
This has the particular advantage that \<^isadof> commands can be arbitrarily mixed with
Isabelle/HOL commands providing the machinery for type declarations and term specifications such
as enumerations. In particular, document class definitions provide:
The \<^isadof> ontology specification language basically consists of a notation for document classes,
where the attributes are typed with HOL-types and can be instantiated by HOL-terms, \<^ie>,
the actual parsers and type-checkers of the Isabelle system are reused. This has the particular
advantage that \<^isadof> commands can be arbitrarily mixed with Isabelle/HOL commands providing the
machinery for type declarations and term specifications such as enumerations. In particular,
document class definitions provide:
\<^item> a HOL-type for each document class as well as inheritance,
\<^item> support for attributes with HOL-types and optional default values,
\<^item> support for overriding of attribute defaults but not overloading, and
\<^item> text-elements annotated with document classes; they are mutable
instances of document classes.
\<close>
instances of document classes.\<close>
text\<open>
Attributes referring to other ontological concepts are called \<^emph>\<open>links\<close>.
The HOL-types inside the document specification language support built-in types for Isabelle/HOL
\inlineisar+typ+'s, \inlineisar+term+'s, and \inlineisar+thm+'s reflecting internal Isabelle's
internal types for these entities; when denoted in HOL-terms to instantiate an attribute, for
example, there is a specific syntax (called \<^emph>\<open>inner syntax antiquotations\<close>) that is checked by
\<^isadof> for consistency.
Attributes referring to other ontological concepts are called \<^emph>\<open>links\<close>. The HOL-types inside the
document specification language support built-in types for Isabelle/HOL \<^theory_text>\<open>typ\<close>'s, \<^theory_text>\<open>term\<close>'s, and
\<^theory_text>\<open>thm\<close>'s reflecting Isabelle's internal types for these entities; when denoted in
HOL-terms to instantiate an attribute, for example, there is a specific syntax
(called \<^emph>\<open>inner syntax antiquotations\<close>) that is checked by \<^isadof> for consistency.
Document classes can have a \inlineisar+where+ clause containing a regular
expression over class names. Classes with such a \inlineisar+where+ were called \<^emph>\<open>monitor classes\<close>.
While document classes and their inheritance relation structure meta-data of text-elements
in an object-oriented manner, monitor classes enforce structural organization
of documents via the language specified by the regular expression
enforcing a sequence of text-elements that must belong to the corresponding classes.
To start using \<^isadof>, one creates an Isabelle project (with the name
\inlinebash{IsaDofApplications}):
\begin{bash}
isabelle dof_mkroot -o scholarly_paper -t lncs IsaDofApplications
\end{bash}
where the \inlinebash{-o scholarly_paper} specifies the ontology for writing scientific articles and
\inlinebash{-t lncs} specifies the use of Springer's \LaTeX-configuration for the Lecture Notes in
Computer Science series. The project can be formally checked, including the generation of the
article in PDF using the following command:
\begin{bash}
isabelle build -d . IsaDofApplications
\end{bash}
\<close>
Document classes can have a \<^theory_text>\<open>where\<close> clause containing a regular expression over class names.
Classes with such a \<^theory_text>\<open>where\<close> clause are called \<^emph>\<open>monitor classes\<close>. While document classes and their
inheritance relation structure meta-data of text-elements in an object-oriented manner, monitor
classes enforce the structural organization of documents via the language specified by the regular
expression, which enforces a sequence of text-elements that must belong to the corresponding classes. \<^vs>\<open>-0.4cm\<close>\<close>
section*[ontomod::text_section]\<open> Modeling Ontologies in \<^isadof> \<close>
text\<open> In this section, we will use the \<^isadof> document ontology language
for three different application scenarios: for scholarly papers, for mathematical
exam sheets as well as standardization documents where the concepts of the
standard are captured in the ontology. For space reasons, we will concentrate in all three
cases on aspects of the modeling due to space limitations.\<close>
text\<open> In this section, we will use the \<^isadof> document ontology language for three different
application scenarios: for scholarly papers, for mathematical exam sheets as well as standardization
documents where the concepts of the standard are captured in the ontology. For space reasons, we
will concentrate in all three cases on aspects of the modeling.\<close>
subsection*[scholar_onto::example]\<open> The Scholar Paper Scenario: Eating One's Own Dog Food. \<close>
text\<open> The following ontology is a simple ontology modeling scientific papers. In this
\<^isadof> application scenario, we deliberately refrain from integrating references to
(Isabelle) formal content in order to demonstrate that \<^isadof> is not a framework from
Isabelle users to Isabelle users only.
Of course, such references can be added easily and represent a particular strength
of \<^isadof>.
Isabelle users to Isabelle users only. Of course, such references can be added easily and
represent a particular strength of \<^isadof>.\<close>
\begin{figure}
\begin{isar}
text*["paper_onto_core"::float,
main_caption="\<open>The core of the ontology definition for writing scholarly papers.\<close>"]
\<open>@{boxed_theory_text [display]\<open>
doc_class title =
short_title :: "string option" <= None
@@ -299,64 +331,62 @@ doc_class abstract =
doc_class text_section =
main_author :: "author option" <= None
todo_list :: "string list" <= "[]"
\end{isar}
\caption{The core of the ontology definition for writing scholarly papers.}
\label{fig:paper-onto-core}
\end{figure}
The first part of the ontology \inlineisar+scholarly_paper+ (see \autoref{fig:paper-onto-core})
todo_list :: "string list" <= "[]"
\<close>}\<close>
text\<open> The first part of the ontology \<^theory_text>\<open>scholarly_paper\<close>
(see @{float "paper_onto_core"})
contains the document class definitions
with the usual text-elements of a scientific paper. The attributes \inlineisar+short_title+,
\inlineisar+abbrev+ etc are introduced with their types as well as their default values.
Our model prescribes an optional \inlineisar+main_author+ and a todo-list attached to an arbitrary
with the usual text-elements of a scientific paper. The attributes \<^theory_text>\<open>short_title\<close>,
\<^theory_text>\<open>abbrev\<close> etc are introduced with their types as well as their default values.
Our model prescribes an optional \<^theory_text>\<open>main_author\<close> and a todo-list attached to an arbitrary
text section; since instances of this class are mutable (meta)-objects of text-elements, they
can be modified arbitrarily through subsequent text and of course globally during text evolution.
Since \inlineisar+author+ is a HOL-type internally generated by \<^isadof> framework and can therefore
appear in the \inlineisar+main_author+ attribute of the \inlineisar+text_section+ class;
Since \<^theory_text>\<open>author\<close> is a HOL-type internally generated by the \<^isadof> framework, it can
appear in the \<^theory_text>\<open>main_author\<close> attribute of the \<^theory_text>\<open>text_section\<close> class;
semantic links between concepts can be modeled this way.
The translation of its content to, \<^eg>, Springer's \<^LaTeX> setup for the Lecture Notes in Computer
Science Series, as required by many scientific conferences, is mostly straight-forward. \<close>
Science Series, as required by many scientific conferences, is mostly straight-forward.
\<^vs>\<open>-0.8cm\<close>\<close>
figure*[fig1::figure,spawn_columns=False,relative_width="95",src="''figures/Dogfood-Intro''"]
figure*[fig1::figure,relative_width="95",file_src="''figures/Dogfood-Intro.png''"]
\<open> Ouroboros I: This paper from inside \<^dots> \<close>
text\<open> @{figure \<open>fig1\<close>} shows the corresponding view in the Isabelle/PIDE of the present paper.
(*<*)declare_reference*[paper_onto_sections::float](*>*)
text\<open>\<^vs>\<open>-0.8cm\<close> @{figure \<open>fig1\<close>} shows the corresponding view in the Isabelle/PIDE of the present paper.
Note that the text uses \<^isadof>'s own text-commands containing the meta-information provided by
the underlying ontology.
We proceed by a definition of \inlineisar+introduction+'s, which we define as the extension of
\inlineisar+text_section+ which is intended to capture common infrastructure:
\begin{isar}
We proceed by a definition of \<^theory_text>\<open>introduction\<close>'s, which we define as the extension of
\<^theory_text>\<open>text_section\<close> which is intended to capture common infrastructure:
@{boxed_theory_text [display]\<open>
doc_class introduction = text_section +
comment :: string
\end{isar}
As a consequence of the definition as extension, the \inlineisar+introduction+ class
inherits the attributes \inlineisar+main_author+ and \inlineisar+todo_list+ together with
\<close>}
As a consequence of the definition as extension, the \<^theory_text>\<open>introduction\<close> class
inherits the attributes \<^theory_text>\<open>main_author\<close> and \<^theory_text>\<open>todo_list\<close> together with
the corresponding default values.
As a variant of the introduction, we could add here an attribute that contains the formal
claims of the article --- either here, or, for example, in the keyword list of the abstract.
As type, one could use either the built-in type \inlineisar+term+ (for syntactically correct,
but not necessarily proven entity) or \inlineisar+thm+ (for formally proven entities). It suffices
As type, one could use either the built-in type \<^theory_text>\<open>term\<close> (for syntactically correct,
but not necessarily proven entities) or \<^theory_text>\<open>thm\<close> (for formally proven entities). It suffices
to add the line:
\begin{isar}
@{boxed_theory_text [display]\<open>
claims :: "thm list"
\end{isar}
and to extend the \LaTeX-style accordingly to handle the additional field.
Note that \inlineisar+term+ and \inlineisar+thm+ are types reflecting the core-types of the
\<close>}
and to extend the \<^LaTeX>-style accordingly to handle the additional field.
Note that \<^theory_text>\<open>term\<close> and \<^theory_text>\<open>thm\<close> are types reflecting the core-types of the
Isabelle kernel. In a corresponding conclusion section, one could model analogously an
achievement section; by programming a specific compliance check in SML, the implementation
of automated validation checks for specific categories of papers is envisageable.
Since this requires deeper knowledge in Isabelle programming, however, we consider this out
of the scope of this paper.
We proceed more or less conventionally by the subsequent sections (\autoref{fig:paper-onto-sections})
\begin{figure}
\begin{isar}
doc_class technical = text_section +
definition_list :: "string list" <= "[]"
We proceed more or less conventionally by the subsequent sections (@{float (unchecked)\<open>paper_onto_sections\<close>})\<close>
text*["paper_onto_sections"::float,
main_caption = "''Various types of sections of a scholarly papers.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class example = text_section +
comment :: string
@@ -368,14 +398,13 @@ doc_class related_work = conclusion +
doc_class bibliography =
style :: "string option" <= "''LNCS''"
\end{isar}
\caption{Various types of sections of a scholarly papers.}
\label{fig:paper-onto-sections}
\end{figure}
and finish with a monitor class definition that enforces a textual ordering
in the document core by a regular expression (\autoref{fig:paper-onto-monitor}).
\begin{figure}
\begin{isar}
\<close>}\<close>
(*<*)declare_reference*[paper_onto_monitor::float](*>*)
text\<open>... and finish with a monitor class definition that enforces a textual ordering
in the document core by a regular expression (@{float (unchecked) "paper_onto_monitor"}).\<close>
text*["paper_onto_monitor"::float,
main_caption = "''A monitor for the scholarly paper ontology.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class article =
trace :: "(title + subtitle + author+ abstract +
introduction + technical + example +
@@ -383,23 +412,20 @@ doc_class article =
where "(title ~~ \<lbrakk>subtitle\<rbrakk> ~~ \<lbrace>author\<rbrace>$^+$+ ~~ abstract ~~
introduction ~~ \<lbrace>technical || example\<rbrace>$^+$ ~~ conclusion ~~
bibliography)"
\end{isar}
\caption{A monitor for the scholarly paper ontology.}
\label{fig:paper-onto-monitor}
\end{figure}
\<close>}
\<close>
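text\<open> Read operationally, such a monitor is simply a regular-expression check over the trace
of class names of the text-elements. The following hypothetical Python sketch of the \<^theory_text>\<open>article\<close>
monitor uses our own single-letter encoding of class names; it is an illustration, not
Isabelle/DOF code:

```python
# Illustrative sketch: a monitor checks that the sequence of document-
# element classes matches a regular expression, here for the article
# monitor: title ~~ [subtitle] ~~ author+ ~~ abstract ~~ introduction
# ~~ (technical | example)+ ~~ conclusion ~~ bibliography.
import re

ALPHABET = {"title": "T", "subtitle": "S", "author": "A", "abstract": "B",
            "introduction": "I", "technical": "X", "example": "E",
            "conclusion": "C", "bibliography": "Y"}   # our own encoding
MONITOR = re.compile(r"TS?A+BI(X|E)+CY")

def conforms(trace):
    word = "".join(ALPHABET[c] for c in trace)
    return MONITOR.fullmatch(word) is not None

ok = ["title", "author", "abstract", "introduction", "example",
      "conclusion", "bibliography"]
bad = ["title", "abstract", "author"]
print(conforms(ok), conforms(bad))
```
\<close>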
text\<open> We might wish to add a component into our ontology that models figures to be included into
the document. This boils down to the exercise of modeling structured data in the style of a
functional programming language in HOL and reusing the implicit HOL-type inside a suitable document
class \inlineisar+figure+:
\begin{isar}
class \<^theory_text>\<open>figure\<close>:
@{boxed_theory_text [display]\<open>
datatype placement = h | t | b | ht | hb
doc_class figure = text_section +
relative_width :: "int" (* percent of textwidth *)
src :: "string"
placement :: placement
spawn_columns :: bool <= True
\end{isar}
\<close>}
\<close>
text\<open> Alternatively, by including the HOL-libraries for rationals, it is possible to
@@ -407,11 +433,11 @@ use fractions or even mathematical reals. This must be counterbalanced by syntac
and semantic convenience. Choosing the mathematical reals, \<^eg>, would have the drawback that
attribute evaluation could be substantially more complicated.\<close>
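text\<open> To illustrate the trade-off: exact rational attribute values remain cheap to evaluate,
whereas real-valued attributes would require symbolic treatment. A Python sketch using
\<^verbatim>\<open>Fraction\<close> as a stand-in for HOL's rationals (an analogy, not Isabelle/DOF code;
the column-width example is our own):

```python
# Illustrative sketch: rational attribute values stay exactly computable,
# e.g. a relative width scaled for a two-column layout.
from fractions import Fraction

relative_width = Fraction(85, 100)   # 85 percent of the text width
column_width = Fraction(1, 2)        # hypothetical two-column scaling
effective = relative_width * column_width
print(effective)                     # exact rational arithmetic
```
\<close>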
figure*[fig_figures::figure,spawn_columns=False,relative_width="85",src="''figures/Dogfood-figures''"]
figure*[fig_figures::figure,relative_width="85",file_src="''figures/Dogfood-figures.png''"]
\<open> Ouroboros II: figures \<^dots> \<close>
text\<open> The document class \inlineisar+figure+ --- supported by the \<^isadof> text command
\inlineisar+figure*+ --- makes it possible to express the pictures and diagrams in this paper
text\<open> The document class \<^theory_text>\<open>figure\<close> --- supported by the \<^isadof> text command
\<^theory_text>\<open>figure*\<close> --- makes it possible to express the pictures and diagrams in this paper
such as @{figure \<open>fig_figures\<close>}.
\<close>
@@ -434,10 +460,10 @@ We assume that the content has four different types of addressees, which have a
text\<open> The latter quality assurance mechanism is used in many universities,
where for organizational reasons the execution of an exam takes place in facilities
where the author of the exam is not expected to be physically present.
Furthermore, we assume a simple grade system (thus, some calculation is required).
\begin{figure}
\begin{isar}
Furthermore, we assume a simple grade system (thus, some calculation is required). \<close>
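text\<open> The calculation behind such a grade system can be pictured as folding the exercise marks
into a global grade. A hypothetical Python sketch (the thresholds and the mapping to
\<^theory_text>\<open>A1\<close>/\<^theory_text>\<open>A2\<close>/\<^theory_text>\<open>A3\<close> are our own invention, not part of the ontology):

```python
# Illustrative sketch: map the sum of exercise marks to a global grade.
# Thresholds are invented for illustration only.
def global_grade(marks, thresholds=((90, "A1"), (75, "A2"), (0, "A3"))):
    total = sum(marks)
    for bound, grade in thresholds:
        if total >= bound:
            return grade

print(global_grade([40, 35, 20]), global_grade([30, 25]))
```
\<close>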
text*["onto_exam"::float,
main_caption = "''The core of the ontology modeling math exams.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class Author = ...
datatype Subject = algebra | geometry | statistical
datatype Grade = A1 | A2 | A3
@@ -459,18 +485,18 @@ doc_class Exam_item =
concerns :: "ContentClass set"
type_synonym SubQuestion = string
\end{isar}
\caption{The core of the ontology modeling math exams.}
\label{fig:onto-exam}
\end{figure}
The heart of this ontology (see \autoref{fig:onto-exam}) is an alternation of questions and answers,
\<close>}\<close>
(*<*)declare_reference*[onto_questions::float](*>*)
text\<open>The heart of this ontology (see @{float "onto_exam"}) is an alternation of questions and answers,
where the answers can consist of simple yes-no answers (QCM style check-boxes) or lists of formulas.
Since we do not
assume familiarity of the students with Isabelle (\inlineisar+term+ would assume that this is a
assume familiarity of the students with Isabelle (\<^theory_text>\<open>term\<close> would assume that this is a
parse-able and type-checkable entity), we basically model a derivation as a sequence of strings
(see \autoref{fig:onto-questions}).
\begin{figure}
\begin{isar}
(see @{float (unchecked)"onto_questions"}).\<close>
text*["onto_questions"::float,
main_caption = "''An exam can contain different types of questions.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class Answer_Formal_Step = Exam_item +
justification :: string
"term" :: "string"
@@ -494,19 +520,18 @@ doc_class Exercise = Exam_item +
content :: "(Task) list"
concerns :: "ContentClass set" <= "UNIV"
mark :: int
\end{isar}
\caption{An exam can contain different types of questions.}
\label{fig:onto-questions}
\end{figure}
\<close>}\<close>
(*<*)declare_reference*[onto_exam_monitor::float](*>*)
text\<open>
In many institutions, it makes sense to have a rigorous process of validation
for exam subjects: is the initial question correct? Is a proof in the sense of the
question possible? We model the possibility that the @{term examiner} validates a
question by a sample proof checked by Isabelle (see \autoref{fig:onto-exam-monitor}).
question by a sample proof checked by Isabelle (see @{float (unchecked) "onto_exam_monitor"}).
In our scenario, these sample proofs are completely \<^emph>\<open>internal\<close>, \<^ie>, not exposed to the
students but just additional material for the internal review process of the exam.
\begin{figure}
\begin{isar}
students but just additional material for the internal review process of the exam.\<close>
text*["onto_exam_monitor"::float,
main_caption = "''Validating exams.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class Validation =
tests :: "term list" <="[]"
proofs :: "thm list" <="[]"
@@ -520,14 +545,9 @@ doc_class MathExam=
content :: "(Header + Author + Exercise) list"
global_grade :: Grade
where "\<lbrace>Author\<rbrace>$^+$ ~~ Header ~~ \<lbrace>Exercise ~~ Solution\<rbrace>$^+$ "
\end{isar}
\caption{Validating exams.}
\label{fig:onto-exam-monitor}
\end{figure}
\<close>
\<close>}\<close>
declare_reference*["fig_qcm"::figure]
(*<*)declare_reference*["fig_qcm"::figure](*>*)
text\<open> Using the \<^LaTeX> package hyperref, it is possible to conceive interactive
exam-sheets with multiple-choice and/or free-response elements
@@ -535,14 +555,14 @@ exam-sheets with multiple-choice and/or free-response elements
help of the latter, students can write a formal mathematical
derivation in a browser---as part of an algebra exercise, for example---and submit it to the examiners
electronically. \<close>
figure*[fig_qcm::figure,spawn_columns=False,
relative_width="90",src="''figures/InteractiveMathSheet''"]
\<open> A Generated QCM Fragment \<^dots> \<close>
figure*[fig_qcm::figure,
relative_width="90",file_src="''figures/InteractiveMathSheet.png''"]
\<open>A Generated QCM Fragment \<^dots> \<close>
subsection*[cenelec_onto::example]\<open> The Certification Scenario following CENELEC \<close>
text\<open> Documents to be provided in formal certifications (such as CENELEC
50126/50128, the DO-178B/C, or Common Criteria) can profit greatly from the control of ontological consistency:
a lot of an evaluator's work consists in tracing down the links from requirements over
50126/50128, the DO-178B/C, or Common Criteria) can profit greatly from the control of ontological
consistency: a lot of an evaluator's work consists in tracing down the links from requirements over
assumptions down to elements of evidence, be it in the models, the code, or the tests.
In a certification process, traceability becomes a major concern, and providing
mechanisms to ensure complete traceability already at the development of the
@@ -554,15 +574,17 @@ of developments targeting certifications. Continuously checking the links between the formal
and the semi-formal parts of such documents is particularly valuable during the (usually
collaborative) development effort.
As in many other cases, formal certification documents come with their own terminology and
pragmatics of what has to be demonstrated and where, and how the traceability of requirements through
As in many other cases, formal certification documents come with their own terminology and pragmatics
of what has to be demonstrated and where, and how the traceability of requirements through
design-models over code to system environment assumptions has to be assured.
\<close>
(*<*)declare_reference*["conceptual"::float](*>*)
text\<open> In the sequel, we present a simplified version of an ontological model used in a
case-study~@{cite "bezzecchi.ea:making:2018"}. We start with an introduction to the concept of requirement
(see \autoref{fig:conceptual}).
\begin{figure}
\begin{isar}
(see @{float (unchecked) "conceptual"}). \<close>
text*["conceptual"::float,
main_caption = "''Modeling requirements.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class requirement = long_name :: "string option"
doc_class requirement_analysis = no :: "nat"
@@ -575,11 +597,9 @@ datatype ass_kind = informal | semiformal | formal
doc_class assumption = requirement +
assumption_kind :: ass_kind <= informal
\end{isar}
\caption{Modeling requirements.}
\label{fig:conceptual}
\end{figure}
Such ontologies can be enriched by larger explanations and examples, which may help
\<close>}\<close>
text\<open>Such ontologies can be enriched by larger explanations and examples, which may help
the team of engineers substantially when developing the central document for a certification,
like an explication of what precisely distinguishes a \<^emph>\<open>hypothesis\<close> from an
\<^emph>\<open>assumption\<close> in the context of the evaluation standard. Since the PIDE makes for each
@@ -601,71 +621,70 @@ is the category \<^emph>\<open>safety related application condition\<close> (or
for short) which is used for \<^emph>\<open>ec\<close>'s that establish safety properties
of the evaluation target. Their traceability throughout the certification
is therefore particularly critical. This is naturally modeled as follows:
\begin{isar}
@{boxed_theory_text [display]\<open>
doc_class ec = assumption +
assumption_kind :: ass_kind <= (*default *) formal
doc_class srac = ec +
assumption_kind :: ass_kind <= (*default *) formal
\end{isar}
\<close>}
\<close>
section*[ontopide::technical]\<open> Ontology-based IDE support \<close>
text\<open> We present a selection of interaction scenarios @{example \<open>scholar_onto\<close>}
and @{example \<open>cenelec_onto\<close>} with Isabelle/PIDE instrumented by \<^isadof>. \<close>
(*<*)
declare_reference*["text_elements"::float]
declare_reference*["hyperlinks"::float]
(*>*)
subsection*[scholar_pide::example]\<open> A Scholarly Paper \<close>
text\<open> In \autoref{fig-Dogfood-II-bgnd1} and \autoref{fig-bgnd-text_section} we show how
text\<open> In @{float (unchecked) "text_elements"}~(a)
and @{float (unchecked) "text_elements"}~(b) we show how
hovering over links makes it possible to explore their meta-information.
Clicking on a document class identifier allows jumping to the corresponding
class definition (\autoref{fig:Dogfood-IV-jumpInDocCLass}); hovering over an attribute-definition
(which is qualified in order to disambiguate; \autoref{fig:Dogfood-V-attribute}).
class definition (@{float (unchecked) "hyperlinks"}~(a)); hovering over an attribute-definition
(which is qualified in order to disambiguate; @{float (unchecked) "hyperlinks"}~(b)).
\<close>
side_by_side_figure*["text-elements"::side_by_side_figure,anchor="''fig-Dogfood-II-bgnd1''",
caption="''Exploring a Reference of a Text-Element.''",relative_width="48",
src="''figures/Dogfood-II-bgnd1''",anchor2="''fig-bgnd-text_section''",
caption2="''Exploring the class of a text element.''",relative_width2="47",
src2="''figures/Dogfood-III-bgnd-text_section''"]\<open> Exploring text elements. \<close>
side_by_side_figure*["hyperlinks"::side_by_side_figure,anchor="''fig:Dogfood-IV-jumpInDocCLass''",
caption="''Hyperlink to Class-Definition.''",relative_width="48",
src="''figures/Dogfood-IV-jumpInDocCLass''",anchor2="''fig:Dogfood-V-attribute''",
caption2="''Exploring an attribute.''",relative_width2="47",
src2="''figures/Dogfood-III-bgnd-text_section''"]\<open> Hyperlinks.\<close>
text*["text_elements"::float,
main_caption="\<open>Exploring text elements.\<close>"]
\<open>
@{fig_content (width=53, height=5, caption="Exploring a reference of a text element.") "figures/Dogfood-II-bgnd1.png"
}\<^hfill>@{fig_content (width=47, height=5, caption="Exploring the class of a text element.") "figures/Dogfood-III-bgnd-text_section.png"}
\<close>
text*["hyperlinks"::float,
main_caption="\<open>Hyperlinks.\<close>"]
\<open>
@{fig_content (width=48, caption="Hyperlink to Class-Definition.") "figures/Dogfood-IV-jumpInDocCLass.png"
}\<^hfill>@{fig_content (width=47, caption="Exploring an attribute.") "figures/Dogfood-V-attribute.png"}
\<close>
declare_reference*["figDogfoodVIlinkappl"::figure]
text\<open> An ontological reference application is shown in \autoref{figDogfoodVIlinkappl}: the ontology-dependent
antiquotation \inlineisar|@ {example ...}| refers to the corresponding text-elements. Hovering allows
for inspection, clicking for jumping to the definition. If the link does not exist or has an
incompatible type, the text is not validated. \<close>
figure*[figDogfoodVIlinkappl::figure,relative_width="80",src="''figures/Dogfood-V-attribute''"]
\<open> Exploring an attribute (hyperlinked to the class). \<close>
subsection*[cenelec_pide::example]\<open> CENELEC \<close>
declare_reference*[figfig3::figure]
text\<open> The corresponding view in @{docitem (unchecked) \<open>figfig3\<close>} shows the core part of a document,
(*<*)declare_reference*[figfig3::figure](*>*)
text\<open> The corresponding view in @{figure (unchecked) \<open>figfig3\<close>} shows the core part of a document,
coherent with the @{example \<open>cenelec_onto\<close>}. The first sample shows standard Isabelle antiquotations
@{cite "wenzel:isabelle-isar:2017"} referring to formal entities of a theory. This way, the informal parts
of a document get ``formal content'' and become more robust under change.\<close>
figure*[figfig3::figure,relative_width="80",src="''figures/antiquotations-PIDE''"]
figure*[figfig3::figure,relative_width="80",file_src="''figures/antiquotations-PIDE.png''"]
\<open> Standard antiquotations referring to theory elements.\<close>
declare_reference*[figfig5::figure]
(*<*)declare_reference*[figfig5::figure] (*>*)
text\<open> The subsequent sample in @{figure (unchecked) \<open>figfig5\<close>} shows the definition of a
\<^emph>\<open>safety-related application condition\<close>, a side-condition of a theorem which
has the consequence that a certain calculation must be executed sufficiently fast on an embedded
device. This condition cannot be established inside the formal theory but has to be
checked by system integration tests.\<close>
figure*[figfig5::figure, relative_width="80", src="''figures/srac-definition''"]
figure*[figfig5::figure, relative_width="80", file_src="''figures/srac-definition.png''"]
\<open> Defining a SRAC reference \<^dots> \<close>
figure*[figfig7::figure, relative_width="80", src="''figures/srac-as-es-application''"]
figure*[figfig7::figure, relative_width="80", file_src="''figures/srac-as-es-application.png''"]
\<open> Using a SRAC as EC document reference. \<close>
text\<open> Now we reference in @{figure (unchecked) \<open>figfig7\<close>} this safety-related condition;
text\<open> Now we reference in @{figure \<open>figfig7\<close>} this safety-related condition;
however, this happens in a context where general \<^emph>\<open>exported constraints\<close> are listed.
\<^isadof>'s checks establish that this is legal in the given ontology.
@@ -677,7 +696,7 @@ informal parts. \<close>
section*[onto_future::technical]\<open> Monitor Classes \<close>
text\<open> Besides sub-typing, there is another relation between
document classes: a class can be a \<^emph>\<open>monitor\<close> for other ones,
which is expressed by the occurrence of a \inlineisar+where+ clause
which is expressed by the occurrence of a @{theory_text \<open>where\<close>} clause
in the document class definition containing a regular
expression (see @{example \<open>scholar_onto\<close>}).
While class-extension refers to data-inheritance of attributes,
@@ -686,8 +705,8 @@ in which instances of monitored classes may occur. \<close>
text\<open>
The control of monitors is done by the commands:
\<^item> \inlineisar+open_monitor* + <doc-class>
\<^item> \inlineisar+close_monitor* + <doc-class>
\<^item> \<^theory_text>\<open>open_monitor*\<close> \<^emph>\<open><doc-class>\<close>
\<^item> \<^theory_text>\<open>close_monitor*\<close> \<^emph>\<open><doc-class>\<close>
\<close>
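As a minimal sketch of this mechanism, the following hypothetical, self-contained theory shows a monitor class whose \<^theory_text>\<open>where\<close> clause carries a regular expression over monitored classes, bracketed by the two commands above. The theory name, import path, class names, and default attributes are illustrative only and may deviate from the ontologies in this repository:

```isabelle
theory Monitor_Sketch   (* hypothetical example theory *)
  imports "Isabelle_DOF.Isa_DOF"
begin

doc_class intro      = short :: "string option" <= "None"
doc_class body_text  = short :: "string option" <= "None"
doc_class conclusion = short :: "string option" <= "None"

(* The where clause attaches a regular expression over the monitored
   classes: one intro, one or more body_text, one conclusion. *)
doc_class report =
  style :: "string option" <= "None"
  where "intro ~~ \<lbrace>body_text\<rbrace>\<^sup>+ ~~ conclusion"

open_monitor*[r::report]
(* ... text elements of classes intro, body_text, and conclusion occur
   here and are checked against the automaton of the expression ... *)
close_monitor*[r]

end
```

At \<^theory_text>\<open>close_monitor*\<close>, the automaton derived from the regular expression is expected to be in a final state, exactly as described in the surrounding text.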
text\<open>
where the automaton of the monitor class is expected to be in a final state. In the final state,
@@ -735,8 +754,7 @@ work in this area we are aware of is rOntorium~@{cite "rontorium"}, a plugin
for \<^Protege> that integrates R~@{cite "adler:r:2010"} into an
ontology environment. Here, the main motivation behind this
integration is to allow for statistically analyzing ontological
documents. Thus, this is complementary to our work.
\<close>
documents. Thus, this is complementary to our work.\<close>
text\<open> \<^isadof> in its present form has a number of technical shortcomings as well
as potential not yet explored. On the long list of the shortcomings is the

View File

@@ -1,15 +1,14 @@
session "2018-cicm-isabelle_dof-applications" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof,
dof_ontologies = "Isabelle_DOF.scholarly_paper", dof_template = Isabelle_DOF.lncs,
quick_and_dirty = true]
chapter AFP
session "Isabelle_DOF-Example-I" (AFP) = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof, timeout = 300]
theories
IsaDofApplications
document_files
"root.bib"
"authorarchive.sty"
"preamble.tex"
"lstisadof.sty"
"vector_iD_icon.pdf"
"lstisadof-manual.sty"
"figures/isabelle-architecture.pdf"
"figures/Dogfood-Intro.png"
"figures/InteractiveMathSheet.png"

View File

@@ -1,4 +1,4 @@
%% Copyright (C) 2008-2019 Achim D. Brucker, https://www.brucker.ch
%% Copyright (C) 2008-2023 Achim D. Brucker, https://www.brucker.ch
%%
%% License:
%% This program can be redistributed and/or modified under the terms
@@ -11,21 +11,22 @@
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause
\NeedsTeXFormat{LaTeX2e}\relax
\ProvidesPackage{authorarchive}
[0000/00/00 Unreleased v1.1.1+%
[2023/02/10 v1.3.0
Self-archiving information for scientific publications.]
%
\PassOptionsToPackage{hyphens}{url}
%
\RequirePackage{ifthen}
\RequirePackage[inline]{enumitem}
\RequirePackage{graphicx}
\RequirePackage{orcidlink}
\RequirePackage{eso-pic}
\RequirePackage{intopdf}
\RequirePackage{kvoptions}
\RequirePackage{hyperref}
\RequirePackage{calc}
\RequirePackage{qrcode}
\RequirePackage{hvlogos}
\RequirePackage{etoolbox}
\newrobustcmd\BibTeX{Bib\TeX}
%
%Better url breaking
\g@addto@macro{\UrlBreaks}{\UrlOrds}
@@ -80,31 +81,51 @@
}
\ProcessKeyvalOptions*
% Provide command for dynamic configuration setup
\def\authorsetup{\kvsetkeys{AA}}
\newcommand{\AA@defIncludeFiles}{
\def\AA@bibBibTeX{\AA@bibtexdir/\AA@key.bib}
\def\AA@bibBibTeXLong{\AA@bibtexdir/\AA@key.bibtex}
\def\AA@bibWord{\AA@bibtexdir/\AA@key.word.xml}
\def\AA@bibEndnote{\AA@bibtexdir/\AA@key.enw}
\def\AA@bibRIS{\AA@bibtexdir/\AA@key.ris}
}
\AA@defIncludeFiles
\newboolean{AA@bibExists}
\setboolean{AA@bibExists}{false}
\newcommand{\AA@defIncludeSwitches}{
\IfFileExists{\AA@bibBibTeX}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibBibTeXLong}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibWord}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibEndnote}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibRIS}{\setboolean{AA@bibExists}{true}}{}
}
\AA@defIncludeSwitches
% Provide command for dynamic configuration setup
% \def\authorsetup{\kvsetkeys{AA}}
\newcommand{\authorsetup}[1]{%
\kvsetkeys{AA}{#1}
\AA@defIncludeFiles
\AA@defIncludeSwitches
}
% Load local configuration
\InputIfFileExists{authorarchive.config}{}{}
% define proxy command for setting PDF attributes
\ExplSyntaxOn
\@ifundefined{pdfmanagement_add:nnn}{%
\newcommand{\AA@pdfpagesattribute}[2]{\pdfpagesattr{/#1 #2}}%
}{%
\newcommand{\AA@pdfpagesattribute}[2]{\pdfmanagement_add:nnn{Pages}{#1}{#2}}%
}%
\ExplSyntaxOff
\newlength\AA@x
\newlength\AA@y
\newlength\AA@width
\def\AA@bibBibTeX{\AA@bibtexdir/\AA@key.bib}
\def\AA@bibBibTeXLong{\AA@bibtexdir/\AA@key.bibtex}
\def\AA@bibWord{\AA@bibtexdir/\AA@key.word.xml}
\def\AA@bibEndnote{\AA@bibtexdir/\AA@key.enw}
\def\AA@bibRIS{\AA@bibtexdir/\AA@key.ris}
\newboolean{AA@bibExists}
\setboolean{AA@bibExists}{false}
\IfFileExists{\AA@bibBibTeX}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibBibTeXLong}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibWord}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibEndnote}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibRIS}{\setboolean{AA@bibExists}{true}}{}
\setlength\AA@x{1in+\hoffset+\oddsidemargin}
\newcommand{\authorcrfont}{\footnotesize}
@@ -148,8 +169,7 @@
%%%% LNCS
\ifAA@LNCS%
\ifAA@orcidicon%
\renewcommand{\orcidID}[1]{\href{https://orcid.org/#1}{%
\textsuperscript{\,\includegraphics[height=2\fontcharht\font`A]{vector_iD_icon}}}}
\renewcommand{\orcidID}[1]{\orcidlink{#1}}
\else\relax\fi%
%
\ifthenelse{\equal{\AA@publisher}{UNKNOWN PUBLISHER}}{%
@@ -157,23 +177,11 @@
}{}
\renewcommand{\authorcrfont}{\scriptsize}
\@ifclasswith{llncs}{a4paper}{%
\ExplSyntaxOn
\@ifundefined{pdfmanagement_add:nnn}{%
\pdfpagesattr{/CropBox [92 114 523 780]}%
}{%
\pdfmanagement_add:nnn {Pages}{CropBox}{[92~114~523~780]}
}%
\ExplSyntaxOff
\AA@pdfpagesattribute{CropBox}{[92 114 523 780]}%
\renewcommand{\authorat}[1]{\put(\LenToUnit{\AA@x},40){#1}}%
}{%
\ExplSyntaxOn
\@ifundefined{pdfmanagement_add:nnn}{%
\pdfpagesattr{/CropBox [92 65 523 731]}% LNCS page: 152x235 mm
}{%
\pdfmanagement_add:nnn {Pages}{CropBox}{[92~62~523~731]}
}%
\ExplSyntaxOff
\renewcommand{\authorat}[1]{\put(\LenToUnit{\AA@x},23){#1}}
\AA@pdfpagesattribute{CropBox}{[92 65 523 731]}%
\renewcommand{\authorat}[1]{\put(\LenToUnit{\AA@x},23){#1}}%
}
\setlength{\AA@width}{\textwidth}
\setcounter{tocdepth}{2}
@@ -186,7 +194,7 @@
}{}
\renewcommand{\authorat}[1]{\put(\LenToUnit{\AA@x},35){#1}}
\renewcommand{\authorcrfont}{\scriptsize}
\pdfpagesattr{/CropBox [70 65 526.378 748.15]} % TODO
\AA@pdfpagesattribute{CropBox}{[70 65 526.378 748.15]}
\setlength{\AA@width}{\textwidth}
\setcounter{tocdepth}{2}
\fi
@@ -218,8 +226,6 @@
draft = false,
bookmarksopen = true,
bookmarksnumbered= true,
pdfauthor = {\@author},
pdftitle = {\@title},
}
\@ifpackageloaded{totpages}{%
@@ -305,26 +311,26 @@
\hfill
\begin{itemize*}[label={}, itemjoin={,}]
\IfFileExists{\AA@bibBibTeX}{%
\item \attachandlink{\AA@bibBibTeX}[application/x-bibtex]{BibTeX entry of this paper}{\BibTeX}%
\item \expanded{\attachandlink[\AA@key.bib]{\AA@bibBibTeX}[application/x-bibtex]{BibTeX entry of this paper}{\BibTeX}}%
}{%
\IfFileExists{\AA@bibBibTeXLong}{%
\item \attachandlink[\AA@key.bib]{\AA@bibBibTeXLong}[application/x-bibtex]{BibTeX entry of this paper}{\BibTeX}%
\item \expanded{\attachandlink[\AA@key.bib]{\AA@bibBibTeXLong}[application/x-bibtex]{BibTeX entry of this paper}{\BibTeX}}%
}{%
\typeout{No file \AA@bibBibTeX{} (and no \AA@bibBibTeXLong) found. Reference not embedded in BibTeX format.}%
}%
}%
\IfFileExists{\AA@bibWord}{%
\item \attachandlink{\AA@bibWord}[application/xml]{XML entry of this paper (e.g., for Word 2007 and later)}{Word}%
\item \expanded{\attachandlink[\AA@key.word.xml]{\AA@bibWord}[application/xml]{XML entry of this paper (e.g., for Word 2007 and later)}{Word}}%
}{%
\typeout{No file \AA@bibWord{} found. Reference for Word 2007 and later not embedded.}%
}%
\IfFileExists{\AA@bibEndnote}{%
\item \attachandlink{\AA@bibEndnote}[application/x-endnote-refer]{Endnote entry of this paper}{EndNote}%
\item \expanded{\attachandlink[\AA@key.enw]{\AA@bibEndnote}[application/x-endnote-refer]{Endnote entry of this paper}{EndNote}}%
}{%
\typeout{No file \AA@bibEndnote{} found. Reference not embedded in Endnote format.}%
}%
\IfFileExists{\AA@bibRIS}{%
\item \attachandlink{\AA@bibRIS}[application/x-research-info-systems]{RIS entry of this paper}{RIS}%
\item \expanded{\attachandlink[\AA@key.ris]{\AA@bibRIS}[application/x-research-info-systems]{RIS entry of this paper}{RIS}}%
}{%
\typeout{No file \AA@bibRIS{} found. Reference not embedded in RIS format.}%
}%

View File

@@ -90,9 +90,7 @@
,enhanced jigsaw
,borderline west={2pt}{0pt}{isar!60!black}
,sharp corners
,before skip balanced=0.5\baselineskip plus 2pt
% ,before skip=10pt
% ,after skip=10pt
%,before skip balanced=0.5\baselineskip plus 2pt % works only with Tex Live 2020 and later
,enlarge top by=0mm
,enhanced
,overlay={\node[draw,fill=isar!60!black,xshift=0pt,anchor=north
@@ -136,11 +134,12 @@
\lstloadlanguages{ML}
\providecolor{sml}{named}{red}
\lstdefinestyle{sml}{
basicstyle=\ttfamily,%
commentstyle=\itshape,%
keywordstyle=\bfseries\color{CornflowerBlue},%
ndkeywordstyle=\color{green},%
language=ML
,escapechar=ë%
,basicstyle=\ttfamily%
,commentstyle=\itshape%
,keywordstyle=\bfseries\color{CornflowerBlue}%
,ndkeywordstyle=\color{green}%
,language=ML
% ,literate={%
% {<@>}{@}1%
% }
@@ -150,7 +149,7 @@
,tagstyle=\color{CornflowerBlue}%
,markfirstintag=true%
}%
\def\inlinesml{\lstinline[style=sml,breaklines=true,mathescape,breakatwhitespace=true]}
\def\inlinesml{\lstinline[style=sml,breaklines=true,breakatwhitespace=true]}
\newtcblisting{sml}[1][]{%
listing only%
,boxrule=0pt
@@ -170,7 +169,6 @@
style=sml
,columns=flexible%
,basicstyle=\small\ttfamily
,mathescape
,#1
}
}%
@@ -296,3 +294,34 @@
}%
%% </bash>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <config>
\providecolor{config}{named}{gray}
\newtcblisting{config}[2][]{%
listing only%
,boxrule=0pt
,boxsep=0pt
,colback=white!90!config
,enhanced jigsaw
,borderline west={2pt}{0pt}{config!60!black}
,sharp corners
% ,before skip=10pt
% ,after skip=10pt
,enlarge top by=0mm
,enhanced
,overlay={\node[draw,fill=config!60!black,xshift=0pt,anchor=north
east,font=\bfseries\footnotesize\color{white}]
at (frame.north east) {#2};}
,listing options={
breakatwhitespace=true
,columns=flexible%
,basicstyle=\small\ttfamily
,mathescape
,#1
}
}%
%% </config>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

View File

@@ -20,39 +20,9 @@
\usepackage{xcolor}
\usepackage{paralist}
\usepackage{listings}
\usepackage{lstisadof}
\usepackage{xspace}
\lstloadlanguages{bash}
\lstdefinestyle{bash}{language=bash,
,basicstyle=\ttfamily%
,showspaces=false%
,showlines=false%
,columns=flexible%
% ,keywordstyle=\bfseries%
% Defining 2-keywords
,keywordstyle=[1]{\color{BrickRed!60}\bfseries}%
% Defining 3-keywords
,keywordstyle=[2]{\color{OliveGreen!60}\bfseries}%
% Defining 4-keywords
,keywordstyle=[3]{\color{black!60}\bfseries}%
% Defining 5-keywords
,keywordstyle=[4]{\color{Blue!70}\bfseries}%
% Defining 6-keywords
,keywordstyle=[5]{\itshape}%
%
}
\lstdefinestyle{displaybash}{style=bash,
basicstyle=\ttfamily\footnotesize,
backgroundcolor=\color{black!2}, frame=lines}%
\lstnewenvironment{bash}[1][]{\lstset{style=displaybash, #1}}{}
\def\inlinebash{\lstinline[style=bash, breaklines=true,columns=fullflexible]}
\usepackage[caption]{subfig}
\usepackage[size=footnotesize]{caption}
\usepackage{lstisadof-manual}
\providecommand{\isactrlemph}[1]{\emph{#1}}
\usepackage[LNCS,
orcidicon,
key=brucker.ea-isabelle-ontologies-2018,

View File

@@ -0,0 +1,9 @@
chapter AFP
session "Isabelle_DOF-Example-II" (AFP) = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof, timeout = 300]
theories
"paper"
document_files
"root.bib"
"preamble.tex"

View File

@@ -1,6 +1,8 @@
%% This is a placeholder for user-specific configuration and packages.
\usepackage{stmaryrd}
\usepackage{pifont}% http://ctan.org/pkg/pifont
\title{<TITLE>}
\author{<AUTHOR>}

View File

@@ -6870,7 +6870,7 @@ isbn="978-3-540-48509-4"
title = {{Isabelle's} Logic: {HOL}},
author = {Tobias Nipkow and Lawrence C. Paulson and Markus Wenzel},
year = 2009,
misc = {\url{http://isabelle.in.tum.de/library/HOL/}}
misc = {\url{https://isabelle.in.tum.de/library/HOL/}}
}
@InProceedings{ garson.ea:security:2008,
@@ -11000,7 +11000,7 @@ isbn="978-1-4471-3182-3"
journal = {Archive of Formal Proofs},
month = apr,
year = 2019,
note = {\url{http://isa-afp.org/entries/HOL-CSP.html}},
note = {\url{https://isa-afp.org/entries/HOL-CSP.html}},
ISSN = {2150-914x},
}

View File

@@ -3,6 +3,8 @@ theory "paper"
imports "Isabelle_DOF.scholarly_paper"
begin
use_template "scrartcl"
use_ontology "scholarly_paper"
open_monitor*[this::article]
@@ -10,10 +12,13 @@ declare[[ strict_monitor_checking = false]]
declare[[ Definition_default_class = "definition"]]
declare[[ Lemma_default_class = "lemma"]]
declare[[ Theorem_default_class = "theorem"]]
declare[[ Corollary_default_class = "corollary"]]
define_shortcut* csp \<rightleftharpoons> \<open>CSP\<close>
holcsp \<rightleftharpoons> \<open>HOL-CSP\<close>
isabelle \<rightleftharpoons> \<open>Isabelle/HOL\<close>
hfill \<rightleftharpoons> \<open>\hfill\<close>
br \<rightleftharpoons> \<open>\break\<close>
(*>*)
@@ -25,7 +30,7 @@ author*[lina,email="\<open>lina.ye@lri.fr\<close>",affiliation="\<open>LRI, Inri
abstract*[abs, keywordlist="[\<open>Shallow Embedding\<close>,\<open>Process-Algebra\<close>,
\<open>Concurrency\<close>,\<open>Computational Models\<close>]"]
\<open> The theory of Communicating Sequential Processes going back to Hoare and Roscoe is still today
\<open> The theory of Communicating Sequential Processes going back to Hoare and Roscoe is still today
one of the reference theories for concurrent specification and computing. In 1997, a first
formalization in \<^isabelle> of the denotational semantics of the Failure/Divergence Model of
\<^csp> was undertaken; in particular, this model can cope with infinite alphabets, in contrast
@@ -49,33 +54,32 @@ abstract*[abs, keywordlist="[\<open>Shallow Embedding\<close>,\<open>Process-Alg
If you consider citing this paper, please refer to @{cite "HOL-CSP-iFM2020"}.
\<close>
text\<open>\<close>
section*[introheader::introduction,main_author="Some(@{docitem ''bu''}::author)"]\<open> Introduction \<close>
text*[introtext::introduction]\<open>
Communicating Sequential Processes (\<^csp>) is a language
to specify and verify patterns of interaction of concurrent systems.
Together with CCS and LOTOS, it belongs to the family of \<^emph>\<open>process algebras\<close>.
\<^csp>'s rich theory comprises denotational, operational and algebraic semantic facets
and has influenced programming languages such as Limbo, Crystal, Clojure and
most notably Golang @{cite "donovan2015go"}. \<^csp> has been applied in
industry as a tool for specifying and verifying the concurrent aspects of hardware
systems, such as the T9000 transputer @{cite "Barret95"}.
section*[introheader::introduction,main_author="Some(@{author ''bu''}::author)"]\<open> Introduction \<close>
text*[introtext::introduction, level="Some 1"]\<open>
Communicating Sequential Processes (\<^csp>) is a language to specify and verify patterns of
interaction of concurrent systems. Together with CCS and LOTOS, it belongs to the family of
\<^emph>\<open>process algebras\<close>. \<^csp>'s rich theory comprises denotational, operational and algebraic semantic
facets and has influenced programming languages such as Limbo, Crystal, Clojure and most notably
Golang @{cite "donovan2015go"}. \<^csp> has been applied in industry as a tool for specifying and
verifying the concurrent aspects of hardware systems, such as the T9000 transputer
@{cite "Barret95"}.
The theory of \<^csp> was first described in 1978 in a book by Tony Hoare @{cite "Hoare:1985:CSP:3921"},
but has since evolved substantially @{cite "BrookesHR84" and "brookes-roscoe85" and "roscoe:csp:1998"}.
\<^csp> describes the most common communication and synchronization mechanisms
with one single language primitive: synchronous communication written \<open>_\<lbrakk>_\<rbrakk>_\<close>. \<^csp> semantics is
described by a fully abstract model of behaviour designed to be \<^emph>\<open>compositional\<close>: the denotational
semantics of a process \<open>P\<close> encompasses all possible behaviours of this process in the context of all
possible environments \<open>P \<lbrakk>S\<rbrakk> Env\<close> (where \<open>S\<close> is the set of \<open>atomic events\<close> both \<open>P\<close> and \<open>Env\<close> must
synchronize). This design objective has the consequence that two kinds of choice have to
be distinguished:
\<^enum> the \<^emph>\<open>external choice\<close>, written \<open>_\<box>_\<close>, which forces a process "to follow" whatever
the environment offers, and
\<^enum> the \<^emph>\<open>internal choice\<close>, written \<open>_\<sqinter>_\<close>, which imposes on the environment of a process
"to follow" the non-deterministic choices made.
\<^csp> describes the most common communication and synchronization mechanisms with one single language
primitive: synchronous communication written \<open>_\<lbrakk>_\<rbrakk>_\<close>. \<^csp> semantics is described by a fully abstract
model of behaviour designed to be \<^emph>\<open>compositional\<close>: the denotational semantics of a process \<open>P\<close>
encompasses all possible behaviours of this process in the context of all possible environments
\<open>P \<lbrakk>S\<rbrakk> Env\<close> (where \<open>S\<close> is the set of \<open>atomic events\<close> both \<open>P\<close> and \<open>Env\<close> must synchronize). This
design objective has the consequence that two kinds of choice have to be distinguished: \<^vs>\<open>0.1cm\<close>
\<^enum> the \<^emph>\<open>external choice\<close>, written \<open>_\<box>_\<close>, which forces a process "to follow" whatever
the environment offers, and \<^vs>\<open>-0.4cm\<close>
\<^enum> the \<^emph>\<open>internal choice\<close>, written \<open>_\<sqinter>_\<close>, which imposes on the environment of a process
"to follow" the non-deterministic choices made.\<^vs>\<open>0.3cm\<close>
\<close>
text\<open>
text\<open> \<^vs>\<open>-0.6cm\<close>
Generalizations of these two operators \<open>\<box>x\<in>A. P(x)\<close> and \<open>\<Sqinter>x\<in>A. P(x)\<close> allow for modeling the concepts
of \<^emph>\<open>input\<close> and \<^emph>\<open>output\<close>: Based on the prefix operator \<open>a\<rightarrow>P\<close> (event \<open>a\<close> happens, then the process
proceeds with \<open>P\<close>), receiving input is modeled by \<open>\<box>x\<in>A. x\<rightarrow>P(x)\<close> while sending output is represented
@@ -121,25 +125,11 @@ attempt to formalize denotational \<^csp> semantics covering a part of Bill Rosc
\<^url>\<open>https://gitlri.lri.fr/burkhart.wolff/hol-csp2.0\<close>. In this paper, all Isabelle proofs are
omitted.\<close>}.
\<close>
(*
% Moreover, decomposition rules of the form:
% \begin{center}
% \begin{minipage}[c]{10cm}
% @{cartouche [display] \<open>C \<Longrightarrow> A \<sqsubseteq>\<^sub>F\<^sub>D A' \<Longrightarrow> B \<sqsubseteq>\<^sub>F\<^sub>D B' \<Longrightarrow> A \<lbrakk>S\<rbrakk> B \<sqsubseteq>\<^sub>F\<^sub>D A' \<lbrakk>S\<rbrakk> B'\<close>}
% \end{minipage}
% \end{center}
% are of particular interest since they allow to avoid the costly automata-product construction
% of model-checkers and to separate infinite sub-systems from finite (model-checkable) ones; however,
% their side-conditions \<open>C\<close> are particularly tricky to work out. Decomposition rules may pave the
% way for future tool combinations for model-checkers such as FDR4~@{cite "fdr4"} or
% PAT~@{cite "SunLDP09"} based on proof certifications.*)
section*["pre"::tc,main_author="Some(@{docitem \<open>bu\<close>}::author)"]
section*["pre"::technical,main_author="Some(@{author \<open>bu\<close>}::author)"]
\<open>Preliminaries\<close>
text\<open>\<close>
subsection*[cspsemantics::tc, main_author="Some(@{docitem ''bu''})"]\<open>Denotational \<^csp> Semantics\<close>
subsection*[cspsemantics::technical, main_author="Some(@{author ''bu''})"]\<open>Denotational \<^csp> Semantics\<close>
text\<open> The denotational semantics (following @{cite "roscoe:csp:1998"}) comes in three layers:
the \<^emph>\<open>trace model\<close>, the \<^emph>\<open>(stable) failures model\<close> and the \<^emph>\<open>failure/divergence model\<close>.
@@ -152,10 +142,10 @@ processes \<open>Skip\<close> (successful termination) and \<open>Stop\<close> (
\<open>\<T>(Skip) = \<T>(Stop) = {[]}\<close>.
Note that the trace set, representing all \<^emph>\<open>partial\<close> histories, is in general prefix-closed.\<close>
text*[ex1::math_example, status=semiformal, level="Some 1"] \<open>
Let two processes be defined as follows:\<^vs>\<open>0.2cm\<close>
\<^enum> \<open>P\<^sub>d\<^sub>e\<^sub>t = (a \<rightarrow> Stop) \<box> (b \<rightarrow> Stop)\<close>
\<^enum> \<open>P\<^sub>n\<^sub>d\<^sub>e\<^sub>t = (a \<rightarrow> Stop) \<sqinter> (b \<rightarrow> Stop)\<close>
\<close>
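For concreteness, these semantics can be computed by hand from the definitions above (our own worked figures, not a \<^holcsp> listing):

```isabelle
(* Hand-computed sketch: both processes have the same traces,
     \<T>(P\<^sub>d\<^sub>e\<^sub>t) = \<T>(P\<^sub>n\<^sub>d\<^sub>e\<^sub>t) = {[], [a], [b]},
   but they differ in their failures: after the empty trace, P\<^sub>n\<^sub>d\<^sub>e\<^sub>t may
   internally commit to one branch and thus refuse {a} (or {b}), whereas
   P\<^sub>d\<^sub>e\<^sub>t can refuse neither a nor b. *)
```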
The following process \<open>P\<^sub>i\<^sub>n\<^sub>f\<close> is an infinite process that performs \<open>a\<close> infinitely
many times. However, using the \<^csp> hiding operator \<open>_\_\<close>, this activity is concealed:
\<^enum> \<open>P\<^sub>i\<^sub>n\<^sub>f = (\<mu> X. a \<rightarrow> X) \ {a}\<close>
\<close>
text\<open>where \<open>P\<^sub>i\<^sub>n\<^sub>f\<close> will be equivalent to \<open>\<bottom>\<close> in the process cpo ordering.
\<close>
subsection*["isabelleHol"::technical, main_author="Some(@{author ''bu''})"]\<open>Isabelle/HOL\<close>
text\<open> Nowadays, Isabelle/HOL is one of the major interactive theory development environments
@{cite "nipkow.ea:isabelle:2002"}. HOL stands for Higher-Order Logic, a logic based on simply-typed
\<open>\<lambda>\<close>-calculus extended by parametric polymorphism and Haskell-like type-classes.
For the work presented here, one relevant construction is:
\<^item> \<^theory_text>\<open>typedef (\<alpha>\<^sub>1,...,\<alpha>\<^sub>n)t = E\<close>
It creates a fresh type that is isomorphic to a set \<open>E\<close> involving \<open>\<alpha>\<^sub>1,...,\<alpha>\<^sub>n\<close> types.
Isabelle/HOL performs a number of syntactic checks for these constructions that guarantee the logical consistency of the extension.
For this work, a particular library called \<^theory_text>\<open>HOLCF\<close> is intensively used. It provides classical
domain theory for a particular type-class \<open>\<alpha>::pcpo\<close>, \<^ie> the class of types \<open>\<alpha>\<close> for which
\<^enum> a least element \<open>\<bottom>\<close> is defined, and
\<^enum> a complete partial order \<open>_\<sqsubseteq>_\<close> is defined.
For these types, \<^theory_text>\<open>HOLCF\<close> provides a fixed-point operator \<open>\<mu>X. f X\<close> as well as the
fixed-point induction and other (automated) proof infrastructure. Isabelle's type-inference can
automatically infer, for example, that if \<open>\<alpha>::pcpo\<close>, then \<open>(\<beta> \<Rightarrow> \<alpha>)::pcpo\<close>. \<close>
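For readers less familiar with these mechanisms, the following self-contained sketch illustrates both constructions; the toy type \<open>nelist\<close> and its witness proof are ours, while \<open>fix\<close> and \<open>fix_eq\<close> are actual HOLCF identifiers:

```isabelle
theory Typedef_And_Fix_Sketch
  imports HOLCF
begin

(* typedef: a fresh type isomorphic to a defining set; Isabelle demands
   a witness showing that the set is non-empty. *)
typedef 'a nelist = "{xs :: 'a list. xs \<noteq> []}"
  by (rule exI[of _ "[undefined]"]) simp

(* HOLCF: \<mu>X. f X is fix\<cdot>(\<Lambda> X. f X); its characteristic
   unfolding law is the library lemma fix_eq. *)
lemma "fix\<cdot>F = F\<cdot>(fix\<cdot>F)"
  by (rule fix_eq)

end
```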
section*["csphol"::technical,main_author="Some(@{author ''bu''}::author)", level="Some 2"]
\<open>Formalising Denotational \<^csp> Semantics in HOL \<close>
text\<open>\<close>
subsection*["processinv"::technical, main_author="Some(@{author ''bu''})"]
\<open>Process Invariant and Process Type\<close>
text\<open> First, we need a slight revision of the concept
of \<^emph>\<open>trace\<close>: if \<open>\<Sigma>\<close> is the type of the atomic events (represented by a type variable), then
we need to extend this type by a special event \<open>\<surd>\<close> (called "tick") signaling termination.
Thus, traces have the type \<open>(\<Sigma>+\<surd>)\<^sup>*\<close>, written \<open>\<Sigma>\<^sup>\<surd>\<^sup>*\<close>; since \<open>\<surd>\<close> may only occur at the end of a trace,
we need to define a predicate \<open>front\<^sub>-tickFree t\<close> that requires from traces that \<open>\<surd>\<close> can only occur
we need to extend this type by a special event \<open>\<checkmark>\<close> (called "tick") signaling termination.
Thus, traces have the type \<open>(\<Sigma>\<uplus>\<checkmark>)\<^sup>*\<close>, written \<open>\<Sigma>\<^sup>\<checkmark>\<^sup>*\<close>; since \<open>\<checkmark>\<close> may only occur at the end of a trace,
we need to define a predicate \<open>front\<^sub>-tickFree t\<close> that requires from traces that \<open>\<checkmark>\<close> can only occur
at the end.
Second, in the traditional literature, the semantic domain is implicitly described by 9 "axioms"
Informally, these are:
\<^item> the tick accepted after a trace \<open>s\<close> implies that all other events are refused;
\<^item> a divergent trace extended by any suffix is itself a divergent trace;
\<^item> once a process has diverged, it can engage in or refuse any sequence of events;
\<^item> a trace ending with \<open>\<surd>\<close> belonging to divergence set implies that its
maximum prefix without \<open>\<surd>\<close> is also a divergent trace.
\<^item> a trace ending with \<open>\<checkmark>\<close> belonging to divergence set implies that its
maximum prefix without \<open>\<checkmark>\<close> is also a divergent trace.
More formally, a process \<open>P\<close> of the type \<open>\<Sigma> process\<close> should have the following properties:
@{cartouche [display, indent=10] \<open>([],{}) \<in> \<F> P \<and>
(\<forall> s X. (s,X) \<in> \<F> P \<longrightarrow> front_tickFree s) \<and>
(\<forall> s t . (s@t,{}) \<in> \<F> P \<longrightarrow> (s,{}) \<in> \<F> P) \<and>
(\<forall> s X Y. (s,Y) \<in> \<F> P \<and> X\<subseteq>Y \<longrightarrow> (s,X) \<in> \<F> P) \<and>
(\<forall> s X Y. (s,X) \<in> \<F> P \<and> (\<forall>c \<in> Y. ((s@[c],{}) \<notin> \<F> P)) \<longrightarrow> (s,X \<union> Y) \<in> \<F> P) \<and>
(\<forall> s X. (s@[\<checkmark>],{}) \<in> \<F> P \<longrightarrow> (s,X-{\<checkmark>}) \<in> \<F> P) \<and>
(\<forall> s t. s \<in> \<D> P \<and> tickFree s \<and> front_tickFree t \<longrightarrow> s@t \<in> \<D> P) \<and>
(\<forall> s X. s \<in> \<D> P \<longrightarrow> (s,X) \<in> \<F> P) \<and>
(\<forall> s. s@[\<checkmark>] \<in> \<D> P \<longrightarrow> s \<in> \<D> P)\<close>}
Our objective is to encapsulate this wishlist into a type constructed as a conservative
theory extension in our theory \<^holcsp>.
Third, we therefore define a pre-type for processes \<open>\<Sigma> process\<^sub>0\<close> by \<open>\<P>(\<Sigma>\<^sup>\<checkmark>\<^sup>* \<times> \<P>(\<Sigma>\<^sup>\<checkmark>)) \<times> \<P>(\<Sigma>\<^sup>\<checkmark>)\<close>.
Fourth, we turn our wishlist of "axioms" above into the definition of a predicate \<open>is_process P\<close>
of type \<open>\<Sigma> process\<^sub>0 \<Rightarrow> bool\<close> deciding if its conditions are fulfilled. Since \<open>P\<close> is a pre-process,
we replace \<open>\<F>\<close> by \<open>fst\<close> and \<open>\<D>\<close> by \<open>snd\<close> (the HOL projections of a pair).
Fifth and last, we use the following type definition:
\<^item> \<^theory_text>\<open>typedef '\<alpha> process = "{P :: '\<alpha> process\<^sub>0 . is_process P}"\<close>
Isabelle requires a proof of the existence of a witness for this set,
but this can be constructed in a straightforward manner. Suitable definitions for
\<open>\<T>\<close>, \<open>\<F>\<close> and \<open>\<D>\<close>, lifting \<open>fst\<close> and \<open>snd\<close> to the new \<open>'\<alpha> process\<close>-type, allow us to derive
the above properties for any \<open>P::'\<alpha> process\<close>. \<close>
subsection*["operator"::technical, main_author="Some(@{author ''lina''})"]
\<open>\<^csp> Operators over the Process Type\<close>
text\<open> Now, the operators of \<^csp> \<open>Skip\<close>, \<open>Stop\<close>, \<open>_\<sqinter>_\<close>, \<open>_\<box>_\<close>, \<open>_\<rightarrow>_\<close>,\<open>_\<lbrakk>_\<rbrakk>_\<close> etc.
for internal choice, external choice, prefix and parallel composition, can be defined.
For example, we define \<open>_\<sqinter>_\<close> on the pre-process type as follows:
\<^item> \<^theory_text>\<open>definition "P \<sqinter> Q \<equiv> Abs_process(\<F> P \<union> \<F> Q , \<D> P \<union> \<D> Q)"\<close>
where \<open>Rep_process\<close> and \<open>Abs_process\<close> are the representation and abstraction morphisms resulting
from the type definition linking the type \<open>'\<alpha> process\<close> isomorphically to the set \<open>'\<alpha> process\<^sub>0\<close>.
The projection into \<^emph>\<open>failures\<close> is defined by \<open>\<F> = fst \<circ> Rep_process\<close>, whereas the
\<^emph>\<open>divergences\<close> are defined by \<open>\<D> = snd \<circ> Rep_process\<close>. Proving the above properties for
\<open>\<F> (P \<sqinter> Q)\<close> and \<open>\<D> (P \<sqinter> Q)\<close> requires a proof that \<open>(\<F> P \<union> \<F> Q , \<D> P \<union> \<D> Q)\<close>
satisfies the well-formedness conditions of \<open>is_process\<close>, which is fairly simple in this case.
The definitional presentation of the \<^csp> process operators according to @{cite "roscoe:csp:1998"}
always follows this scheme. This part of the theory comprises around 2000 loc.
\<close>
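The Abs/Rep lifting scheme just described can be replayed in miniature, outside of \<^holcsp> (every name below is invented for illustration): the typedef yields \<open>Rep\<close>/\<open>Abs\<close> morphisms, the operation is defined through them, and well-definedness amounts to a closure proof for the carrier predicate.

```isabelle
theory Lifting_Sketch
  imports Main
begin

(* Toy analogue of the process type: pairs (F, D) with F \<subseteq> D,
   standing in for the is_process predicate. *)
typedef 'a system = "{(F, D) :: 'a set \<times> 'a set. F \<subseteq> D}"
  by (rule exI[of _ "({}, {})"]) simp

(* Lifting a binary operation through Abs/Rep, as done for \<sqinter>: *)
definition sup_sys :: "'a system \<Rightarrow> 'a system \<Rightarrow> 'a system" where
  "sup_sys P Q = Abs_system
     (fst (Rep_system P) \<union> fst (Rep_system Q),
      snd (Rep_system P) \<union> snd (Rep_system Q))"

(* The closure obligation corresponding to the is_process proof: *)
lemma "(fst (Rep_system P) \<union> fst (Rep_system Q),
        snd (Rep_system P) \<union> snd (Rep_system Q)) \<in> {(F, D). F \<subseteq> D}"
  using Rep_system[of P] Rep_system[of Q] by (auto simp: case_prod_unfold)

end
```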
subsection*["orderings"::technical, main_author="Some(@{author ''bu''})"]
\<open>Refinement Orderings\<close>
text\<open> \<^csp> is centered around the idea of process refinement; many critical properties,
a conversion of processes in terms of (finite) labelled transition systems leads to
model-checking techniques based on graph-exploration. Essentially, a process \<open>P\<close> \<^emph>\<open>refines\<close>
another process \<open>Q\<close> if and only if it is more deterministic and more defined (has less divergences).
Consequently, each of the three semantics models (trace, failure and failure/divergence)
has its corresponding refinement orderings.\<close>
Theorem*[th1::"theorem", short_name="\<open>Refinement properties\<close>"]\<open>
In this paper, we are interested in the following refinement orderings for the
failure/divergence model:
\<^enum> \<open>P \<sqsubseteq>\<^sub>\<F>\<^sub>\<D> Q \<equiv> \<F> P \<supseteq> \<F> Q \<and> \<D> P \<supseteq> \<D> Q\<close>
\<^enum> \<open>P \<sqsubseteq>\<^sub>\<T>\<^sub>\<D> Q \<equiv> \<T> P \<supseteq> \<T> Q \<and> \<D> P \<supseteq> \<D> Q\<close>
\<^enum> \<open>P \<sqsubseteq>\<^sub>\<FF> Q \<equiv> \<FF> P \<supseteq> \<FF> Q, \<FF>\<in>{\<T>,\<F>,\<D>}\<close> \<close>
text\<open> Notice that in the \<^csp> literature, only \<open>\<sqsubseteq>\<^sub>\<F>\<^sub>\<D>\<close> is well studied for the failure/divergence model.
Our formal analysis of the different granularities of the refinement orderings
allows a deeper understanding of the same semantic model. For example, \<open>\<sqsubseteq>\<^sub>\<T>\<^sub>\<D>\<close> turns
out to have better monotonicity properties in some cases and therefore allows for stronger proof principles.
\<close>
subsection*["fixpoint"::technical, main_author="Some(@{author ''lina''})"]
\<open>Process Ordering and HOLCF\<close>
text\<open> For any denotational semantics, the fixed-point theory giving semantics to systems
of recursive equations is considered the keystone. Its prerequisite is a complete partial ordering
Roscoe and Brooks @{cite "Roscoe1992AnAO"} finally proposed another ordering, called the \<^emph>\<open>process ordering\<close>, such
that completeness could at least be assured for read-operations. This more complex ordering
is based on the concept of \<^emph>\<open>refusals after\<close> a trace \<open>s\<close>, defined by \<open>\<R> P s \<equiv> {X | (s, X) \<in> \<F> P}\<close>.\<close>
Definition*[process_ordering, level= "Some 2", short_name="''process ordering''"]\<open>
We define \<open>P \<sqsubseteq> Q \<equiv> \<psi>\<^sub>\<D> \<and> \<psi>\<^sub>\<R> \<and> \<psi>\<^sub>\<M> \<close>, where
\<^enum> \<open>\<psi>\<^sub>\<D> = \<D> P \<supseteq> \<D> Q \<close>
\<^enum> \<open>\<psi>\<^sub>\<R> = s \<notin> \<D> P \<Rightarrow> \<R> P s = \<R> Q s\<close>
\<^enum> \<open>\<psi>\<^sub>\<M> = Mins(\<D> P) \<subseteq> \<T> Q \<close> \<close>
text\<open>The third condition \<open>\<psi>\<^sub>\<M>\<close> implies that the set of minimal divergent traces
(ones with no proper prefix that is also a divergence) in \<open>P\<close>, denoted by \<open>Mins(\<D> P)\<close>,
should be a subset of the trace set of \<open>Q\<close>.
%One may note that each element in \<open>Mins(\<D> P)\<close> do actually not contain the \<open>\<checkmark>\<close>,
%which can be deduced from the process invariants described
%in the precedent @{technical "processinv"}. This can be explained by the fact that we are not
%really concerned with what a process does after it terminates.
For most \<^csp> operators \<open>\<otimes>\<close> we derived continuity rules.
These rules allow us to automatically infer, for any process term, whether it is continuous or not.
The port of HOL-CSP 2 to HOLCF implied that the derivation of the entire continuity rules
had to be completely re-done (3000 loc).\<close>
Theorem*[th2,short_name="\<open>Fixpoint Induction\<close>"]
\<open>HOL-CSP provides an important proof principle, the fixed-point induction:
@{cartouche [display, indent=5] \<open>cont f \<Longrightarrow> adm P \<Longrightarrow> P \<bottom> \<Longrightarrow> (\<And>X. P X \<Longrightarrow> P(f X)) \<Longrightarrow> P(\<mu>X. f X)\<close>}
\<close>
text\<open>Fixed-point induction of @{theorem th2} requires a small side-calculus for establishing the admissibility
of a predicate; basically, predicates are admissible if they are valid for any least upper bound
of a chain \<open>x\<^sub>1 \<sqsubseteq> x\<^sub>2 \<sqsubseteq> x\<^sub>3 ... \<close> provided that \<open>\<forall>i. P(x\<^sub>i)\<close>. It turns out that \<open>_\<sqsubseteq>_\<close> and \<open>_\<sqsubseteq>\<^sub>F\<^sub>D_\<close> as
well as all other refinement orderings that we introduce in this paper are admissible.
Fixed-point inductions are the main proof weapon in verifications, together with monotonicities
and the \<^csp> laws. Denotational arguments can be hidden as they are not needed in practical
verifications. \<close>
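On the HOLCF side, the rule of @{theorem th2} is available as the library rule \<open>fix_ind\<close>; a typical invocation looks as follows (the predicate \<open>P\<close> and function \<open>F\<close> are placeholders):

```isabelle
(* Sketch of using HOLCF's fixed-point induction (library rule fix_ind):
     adm P \<Longrightarrow> P \<bottom> \<Longrightarrow> (\<And>x. P x \<Longrightarrow> P (F\<cdot>x)) \<Longrightarrow> P (fix\<cdot>F)  *)
lemma "P (fix\<cdot>F)"
  apply (rule fix_ind)
    (* three goals remain: adm P, P \<bottom>, and the induction step *)
  oops
```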
subsection*["law"::technical, main_author="Some(@{author ''lina''})"]
\<open>\<^csp> Rules: Improved Proofs and New Results\<close>
text\<open>The \<^csp> operators enjoy a number of algebraic properties: commutativity,
associativity, and idempotence in some cases. Moreover, there is a rich body of distribution
laws between these operators. Our new version HOL-CSP 2 not only shortens and restructures the
proofs of @{cite "tej.ea:corrected:1997"}; the code reduces to 8000 loc from 25000 loc. \<close>
Theorem*[th3, short_name="\<open>Examples of Derived Rules.\<close>"]\<open>
\<^item> \<open>\<box>x\<in>A\<union>B\<rightarrow>P(x) = (\<box>x\<in>A\<rightarrow>P x) \<box> (\<box>x\<in>B\<rightarrow>P x)\<close>
\<^item> \<open>A\<union>B\<subseteq>C \<Longrightarrow> (\<box>x\<in>A\<rightarrow>P x \<lbrakk>C\<rbrakk> \<box>x\<in>B\<rightarrow>Q x) = \<box>x\<in>A\<inter>B\<rightarrow>(P x \<lbrakk>C\<rbrakk> Q x)\<close>
\<^item> @{cartouche [display]\<open>A\<subseteq>C \<Longrightarrow> B\<inter>C={} \<Longrightarrow>
(\<box>x\<in>A\<rightarrow>P x \<lbrakk>C\<rbrakk> \<box>x\<in>B\<rightarrow>Q x) = \<box>x\<in>B\<rightarrow>(\<box>x\<in>A\<rightarrow>P x \<lbrakk>C\<rbrakk> Q x)\<close>}
\<^item> \<open>finite A \<Longrightarrow> A\<inter>C = {} \<Longrightarrow> ((P \<lbrakk>C\<rbrakk> Q) \ A) = ((P \ A) \<lbrakk>C\<rbrakk> (Q \ A)) ...\<close>\<close>
text\<open>The continuity proof of the hiding operator is notorious. The proof is known to involve the
classical König's lemma stating that every infinite tree with finite branching has an infinite path.
We adapt this lemma to our context as follows:
@{cartouche [display, indent=5]
\<open>infinite tr \<Longrightarrow> \<forall>i. finite{t. \<exists>t'\<in>tr. t = take i t'}
\<Longrightarrow> \<exists> f. strict_mono f \<and> range f \<subseteq> {t. \<exists>t'\<in>tr. t \<le> t'}\<close>}
in order to come up with the continuity rule: \<open>finite S \<Longrightarrow> cont P \<Longrightarrow> cont(\<lambda>X. P X \ S)\<close>.
The original proof had been drastically shortened by a factor of 10, and important intermediate steps
generalized. The number of cases to be considered as well as their complexity makes pen and paper proofs
practically infeasible.
\<close>
section*["newResults"::technical,main_author="Some(@{author ''safouan''}::author)",
main_author="Some(@{author ''lina''}::author)", level= "Some 3"]
\<open>Theoretical Results on Refinement\<close>
text\<open>\<close>
subsection*["adm"::technical,main_author="Some(@{author ''safouan''}::author)",
main_author="Some(@{author ''lina''}::author)"]
\<open>Decomposition Rules\<close>
text\<open>
In our framework, we implemented the pcpo process refinement together with the five refinement
orderings introduced above. Some operators turn out to be monotonic
under all refinement orderings, while others are not.
\<^item> Sequence operator is not monotonic under \<open>\<sqsubseteq>\<^sub>\<F>\<close>, \<open>\<sqsubseteq>\<^sub>\<D>\<close> or \<open>\<sqsubseteq>\<^sub>\<T>\<close>:
@{cartouche [display,indent=5]
\<open>P \<sqsubseteq>\<^sub>\<FF> P'\<Longrightarrow> Q \<sqsubseteq>\<^sub>\<FF> Q' \<Longrightarrow> (P ; Q) \<sqsubseteq>\<^sub>\<FF> (P' ; Q') where \<FF>\<in>{\<T>\<D>,\<F>\<D>}\<close>}
All refinements are right-side monotonic but \<open>\<sqsubseteq>\<^sub>\<F>\<close>, \<open>\<sqsubseteq>\<^sub>\<D>\<close> and \<open>\<sqsubseteq>\<^sub>\<T>\<close> are not left-side monotonic,
which can be explained by the interdependence relationship of failure and divergence projections
for the first component. We thus proved:
\<^item> Hiding operator is not monotonic under \<open>\<sqsubseteq>\<^sub>\<D>\<close>:
@{cartouche [display,indent=5] \<open>P \<sqsubseteq>\<^sub>\<FF> Q \<Longrightarrow> P \ A \<sqsubseteq>\<^sub>\<FF> Q \ A where \<FF>\<in>{\<T>,\<F>,\<T>\<D>,\<F>\<D>}\<close>}
Intuitively, for the divergence refinement of the hiding operator, there may be
some trace \<open>s\<in>\<T> Q\<close> and \<open>s\<notin>\<T> P\<close> such that it becomes divergent in \<open>Q \ A\<close> but
not in \<open>P \ A\<close>.
\<^item> Parallel composition is not monotonic under \<open>\<sqsubseteq>\<^sub>\<F>\<close>, \<open>\<sqsubseteq>\<^sub>\<D>\<close> or \<open>\<sqsubseteq>\<^sub>\<T>\<close>:
@{cartouche [display,indent=5] \<open>P \<sqsubseteq>\<^sub>\<FF> P' \<Longrightarrow> Q \<sqsubseteq>\<^sub>\<FF> Q' \<Longrightarrow> (P \<lbrakk>A\<rbrakk> Q) \<sqsubseteq>\<^sub>\<FF> (P' \<lbrakk>A\<rbrakk> Q') where \<FF>\<in>{\<T>\<D>,\<F>\<D>}\<close>}
The failure and divergence projections of this operator are also interdependent, similar to the
sequence operator. Hence, this operator is not monotonic with \<open>\<sqsubseteq>\<^sub>\<F>\<close>, \<open>\<sqsubseteq>\<^sub>\<D>\<close> and \<open>\<sqsubseteq>\<^sub>\<T>\<close>, but monotonic
when their combinations are considered. \<close>
\<close>
(* Besides the monotonicity results on the above \<^csp> operators,
we have also proved that for other \<^csp> operators, such as multi-prefix and non-deterministic choice,
they are all monotonic with these five refinement orderings. Such theoretical results provide significant indicators
for semantics choices when considering specification decomposition.
We want to emphasize that this is the first work on such substantial
analysis in a formal way, as far as we know.
%In the literature, these processes are defined in a way that does not distinguish the special event \<open>tick\<close>. To be consistent with the idea that ticks should be distinguished on the semantic level, besides the above
three processes,
one can directly prove 3 since for both \<open>CHAOS\<close> and \<open>DF\<close>,
the version with \<open>SKIP\<close> is constructed exactly in the same way from that without \<open>SKIP\<close>.
And 4 is obtained based on the projection laws of internal choice \<open>\<sqinter>\<close>.
Finally, for 5, the difference between \<open>DF\<close> and \<open>RUN\<close> is that the former applies internal choice
while the latter with external choice. From the projection laws of both operators,
the failure set of \<open>RUN\<close> has more constraints, thus being a subset of that of \<open>DF\<close>,
when the divergence set is empty, which is true for both processes.
*)
subsection*["processes"::technical,main_author="Some(@{author ''safouan''}::author)",
main_author="Some(@{author ''lina''}::author)"]
\<open>Reference Processes and their Properties\<close>
text\<open>
We now present reference processes that exhibit basic behaviors, introduced in the literature.
To handle termination better, we added two new processes \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P\<close> and \<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P\<close>.
\<close>
(*<*) (* a test ...*)
text*[X22 ::math_content, level="Some 2" ]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
text*[X32::"definition", level="Some 2", mcc=defn]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
Definition*[X42, level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
Definition*[X52::"definition", level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
text\<open> The \<open>RUN\<close>-process defined in @{math_content X22} represents the process that accepts all
events, but never stops nor deadlocks. The \<open>CHAOS\<close>-process comes in two variants: \<open>CHAOS\<close> either
stops or accepts any offered event, whereas \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P\<close> can additionally terminate.\<close>
(*>*)
Definition*[X2, level="Some 2"]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
Definition*[X3, level="Some 2"]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
Definition*[X4, level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close>\<close>
Definition*[X5, level="Some 2"]\<open>\<open>DF A \<equiv> \<mu> X. (\<sqinter> x \<in> A \<rightarrow> X)\<close> \<close>
Definition*[X6, level="Some 2"]\<open>\<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. ((\<sqinter> x \<in> A \<rightarrow> X) \<sqinter> SKIP)\<close> \<close>
text\<open>In the following, we denote \<open> \<R>\<P> = {DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P, DF, RUN, CHAOS, CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P}\<close>.
All five reference processes are divergence-free,
which was proven by using a particular lemma \<open>\<D> (\<mu> x. f x) = \<Inter>\<^sub>i\<^sub>\<in>\<^sub>\<nat> \<D> (f\<^sup>i \<bottom>)\<close>:
@{cartouche
[display,indent=8] \<open>\<D> (\<PP> UNIV) = {} where \<PP> \<in> \<R>\<P> and UNIV is the set of all events\<close>
}
Regarding the failure refinement ordering, the set of failures \<open>\<F> P\<close> for any process \<open>P\<close> is
a subset of \<open>\<F> (CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV)\<close>.
@{cartouche [display, indent=25] \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F> P\<close>}
Furthermore, the following 5 relationships were demonstrated from monotonicity results and
a denotational proof.
\<close>
Corollary*[co1::"corollary", short_name="\<open>Corollaries on reference processes.\<close>",level="Some 2"]
\<open> \<^hfill> \<^br> \<^vs>\<open>-0.3cm\<close>
\<^enum> \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<sqsubseteq>\<^sub>\<F> CHAOS A\<close>
\<^enum> \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<sqsubseteq>\<^sub>\<F> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P A\<close>
\<^enum> \<open>CHAOS A \<sqsubseteq>\<^sub>\<F> DF A\<close>
\<^enum> \<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<sqsubseteq>\<^sub>\<F> DF A\<close>
\<^enum> \<open>DF A \<sqsubseteq>\<^sub>\<F> RUN A\<close> \<^vs>\<open>0.3cm\<close>
where 1 and 2 are immediate, and where 4 and 5 are directly obtained from our monotonicity
results while 3 requires an argument over the denotational space.
Thanks to transitivity, we can derive other relationships.\<close>
text\<open> Lastly, regarding trace refinement, for any process P,
its set of traces \<open>\<T> P\<close> is a subset of \<open>\<T> (CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV)\<close> and of \<open>\<T> (DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV)\<close> as well.
%As we already proved that \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P\<close> covers all failures,
%we can immediately infer that it also covers all traces.
%The \<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P\<close> case requires a longer denotational proof.
\<^enum> \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T> P\<close>
\<^enum> \<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T> P\<close>
\<close>
text\<open>
@ -596,39 +556,34 @@ verification. For example, if one wants to establish that a protocol implementat
a non-deterministic specification \<open>SPEC\<close>, it suffices to ask whether \<open>IMPL || SPEC\<close> is deadlock-free.
In this setting, \<open>SPEC\<close> becomes a kind of observer that signals non-conformance of \<open>IMPL\<close> by
deadlock.
% A livelocked system looks similar to a deadlocked one from an external point of view.
% However, livelock is sometimes considered as worse since the user may be able to observe the internal
% activities and so hope that some output will happen eventually.
In the literature, deadlock and livelock are phenomena that are often
handled separately. One contribution of our work is to establish their precise relationship inside
the Failure/Divergence Semantics of \<^csp>.\<close>
(* bizarre: Definition* does not work for this single case *)
text*[X10::"definition"]\<open> \<open>deadlock\<^sub>-free P \<equiv> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F> P\<close> \<close>
Definition*[X10::"definition", level="Some 2"]\<open> \<open>deadlock\<^sub>-free P \<equiv> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F> P\<close> \<close>
text\<open>\<^noindent> A process \<open>P\<close> is deadlock-free if and only if after any trace \<open>s\<close> without \<open>\<surd>\<close>, the union of \<open>\<surd>\<close>
text\<open>\<^noindent> A process \<open>P\<close> is deadlock-free if and only if after any trace \<open>s\<close> without \<open>\<checkmark>\<close>, the union of \<open>\<checkmark>\<close>
and all events of \<open>P\<close> can never be a refusal set associated with \<open>s\<close>, which means that \<open>P\<close> cannot
be deadlocked after any non-terminating trace.
\<close>
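The prose characterization above (a process is deadlock-free iff, after any tick-free trace, the set of all its events together with the termination event can never be refused) can be sketched on a toy finite failures model. The following Python sketch is purely illustrative and not the Isabelle formalization: real refusal sets are downward closed, while this encoding stores only the maximal refusals, so the exact-membership test is a simplification; all names are made up.

```python
# Toy failures-model check of deadlock-freeness: after any tick-free
# trace s, the set of *all* events plus the termination event TICK must
# never be refusable.  Real refusal sets are downward closed; this
# sketch only stores the maximal refusals.

TICK = "tick"

def tick_free(trace):
    return TICK not in trace

def deadlock_free(traces, failures, events):
    """traces: set of event tuples; failures: set of (trace, refusal) pairs."""
    full = frozenset(events) | {TICK}
    return all((s, full) not in failures for s in traces if tick_free(s))

# A process looping on "a" refuses at most termination: deadlock-free.
traces = {(), ("a",), ("a", "a")}
failures = {((), frozenset({TICK}))}
print(deadlock_free(traces, failures, {"a"}))       # True

# STOP refuses everything after the empty trace: deadlocked.
stop_failures = {((), frozenset({"a", TICK}))}
print(deadlock_free({()}, stop_failures, {"a"}))    # False
```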
Theorem*[T1, short_name="\<open>DF definition captures deadlock-freeness\<close>"]
\<open> \hfill \break \<open>deadlock_free P \<longleftrightarrow> (\<forall>s\<in>\<T> P. tickFree s \<longrightarrow> (s, {\<surd>}\<union>events_of P) \<notin> \<F> P)\<close> \<close>
Definition*[X11]\<open> \<open>livelock\<^sub>-free P \<equiv> \<D> P = {} \<close> \<close>
Theorem*[T1, short_name="\<open>DF definition captures deadlock-freeness\<close>", level="Some 2"]
\<open> \<^hfill> \<^br> \<open>deadlock_free P \<longleftrightarrow> (\<forall>s\<in>\<T> P. tickFree s \<longrightarrow> (s, {\<checkmark>}\<union>events_of P) \<notin> \<F> P)\<close> \<close>
Definition*[X11, level="Some 2"]\<open> \<open>livelock\<^sub>-free P \<equiv> \<D> P = {} \<close> \<close>
text\<open> Recall that all five reference processes are livelock-free.
We also have the following lemmas about the
livelock-freeness of processes:
\<^enum> \<open>livelock\<^sub>-free P \<longleftrightarrow> \<PP> UNIV \<sqsubseteq>\<^sub>\<D> P where \<PP> \<in> \<R>\<P>\<close>
\<^enum> @{cartouche [display]\<open>livelock\<^sub>-free P \<longleftrightarrow> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T>\<^sub>\<D> P
\<longleftrightarrow> CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T>\<^sub>\<D> P\<close>}
\<^enum> \<open>livelock\<^sub>-free P \<longleftrightarrow> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T>\<^sub>\<D> P \<longleftrightarrow> CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T>\<^sub>\<D> P\<close>
\<^enum> \<open>livelock\<^sub>-free P \<longleftrightarrow> CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F>\<^sub>\<D> P\<close>
\<close>
text\<open>
Finally, we proved the following theorem that confirms the relationship between the two vital
properties:
\<close>
Theorem*[T2, short_name="''DF implies LF''"]
Theorem*[T2, short_name="''DF implies LF''", level="Some 2"]
\<open> \<open>deadlock_free P \<longrightarrow> livelock_free P\<close> \<close>
text\<open>
@ -642,11 +597,11 @@ then it may still be livelock-free. % This makes sense since livelocks are worse
\<close>
section*["advanced"::tc,main_author="Some(@{docitem ''safouan''}::author)",level="Some 3"]
section*["advanced"::technical,main_author="Some(@{author ''safouan''}::author)",level="Some 3"]
\<open>Advanced Verification Techniques\<close>
text\<open>
Based on the refinement framework discussed in @{docitem "newResults"}, we will now
Based on the refinement framework discussed in @{technical "newResults"}, we will now
turn to some more advanced proof principles, tactics and verification techniques.
We will demonstrate them on two paradigmatic examples well-known in the \<^csp> literature:
The CopyBuffer and Dijkstra's Dining Philosophers. In both cases, we will exploit
@ -657,7 +612,7 @@ verification. In the latter case, we present an approach to a verification of a
architecture, in this case a ring-structure of arbitrary size.
\<close>
subsection*["illustration"::tc,main_author="Some(@{docitem ''safouan''}::author)", level="Some 3"]
subsection*["illustration"::technical,main_author="Some(@{author ''safouan''}::author)", level="Some 3"]
\<open>The General CopyBuffer Example\<close>
text\<open>
We consider the paradigmatic copy buffer example @{cite "Hoare:1985:CSP:3921" and "Roscoe:UCS:2010"}
@ -705,7 +660,7 @@ of 2 lines proof-script involving the derived algebraic laws of \<^csp>.
After proving that \<open>SYSTEM\<close> implements \<open>COPY\<close> for arbitrary alphabets, we aim to profit from this
first established result to check which relations \<open>SYSTEM\<close> has w.r.t. the reference processes of
@{docitem "processes"}. Thus, we prove that \<open>COPY\<close> is deadlock-free which implies livelock-free,
@{technical "processes"}. Thus, we prove that \<open>COPY\<close> is deadlock-free which implies livelock-free,
(proof by fixed-induction similar to \<open>lemma: COPY \<sqsubseteq> SYSTEM\<close>), from which we can immediately infer
by transitivity that \<open>SYSTEM\<close> is. Using refinement relations, we killed four birds with one stone
as we proved the deadlock-freeness and the livelock-freeness for both \<open>COPY\<close> and \<open>SYSTEM\<close> processes.
@ -722,7 +677,7 @@ corollary deadlock_free COPY
\<close>
subsection*["inductions"::tc,main_author="Some(@{docitem ''safouan''}::author)"]
subsection*["inductions"::technical,main_author="Some(@{author ''safouan''}::author)"]
\<open>New Fixed-Point Inductions\<close>
text\<open>
@ -739,9 +694,8 @@ For this reason, we derived a number of alternative induction schemes (which are
in the HOLCF library), which are also relevant for our final Dining Philosophers example.
These are essentially adaptations of k-induction schemes applied to a domain-theoretic
setting (so: requiring \<open>f\<close> continuous and \<open>P\<close> admissible; these preconditions are
skipped here):
\<^item> @{cartouche [display]\<open>... \<Longrightarrow> \<forall>i<k. P (f\<^sup>i \<bottom>) \<Longrightarrow> (\<forall>X. (\<forall>i<k. P (f\<^sup>i X)) \<longrightarrow> P (f\<^sup>k X))
\<Longrightarrow> P (\<mu>X. f X)\<close>}
skipped here):\<^vs>\<open>0.2cm\<close>
\<^item> \<open>... \<Longrightarrow> \<forall>i<k. P (f\<^sup>i \<bottom>) \<Longrightarrow> (\<forall>X. (\<forall>i<k. P (f\<^sup>i X)) \<longrightarrow> P (f\<^sup>k X)) \<Longrightarrow> P (\<mu>X. f X)\<close>
\<^item> \<open>... \<Longrightarrow> \<forall>i<k. P (f\<^sup>i \<bottom>) \<Longrightarrow> (\<forall>X. P X \<longrightarrow> P (f\<^sup>k X)) \<Longrightarrow> P (\<mu>X. f X)\<close>
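The second scheme above (base cases on the first k iterates plus preservation of P under k-fold application of f) has a finite-domain analogue that can be sketched in Python. This is only an illustration: the domain-theoretic originals additionally require f continuous and P admissible, which a finite sketch cannot express, and all names below are hypothetical.

```python
# Finite-domain analogue of the weakened k-induction scheme: check the
# base cases on the first k iterates of f from a start value, and check
# that P is preserved by k-fold application of f on a finite domain.

def k_induction(f, p, start, k, domain):
    def iterate(x, n):
        for _ in range(n):
            x = f(x)
        return x
    base = all(p(iterate(start, i)) for i in range(k))
    step = all(not p(x) or p(iterate(x, k)) for x in domain)
    return base and step

# P(x) = "x >= 0" is not preserved by one application of negation
# (1 -> -1) but is preserved by two, so 1-induction fails while
# 2-induction succeeds.
f = lambda x: -x
p = lambda x: x >= 0
domain = range(-3, 4)
print(k_induction(f, p, 0, 1, domain))  # False
print(k_induction(f, p, 0, 2, domain))  # True
```

The toy example mirrors the motivation in the text: a weaker induction step (more hypotheses, or preservation only under several unfoldings) must be paid for by more base cases.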
@ -749,10 +703,9 @@ skipped here):
it reduces the goal size.
Another problem occasionally occurring in refinement proofs happens when the right side term
involves more than one fixed-point process (\<^eg> \<open>P \<lbrakk>{A}\<rbrakk> Q \<sqsubseteq> S\<close>). In this situation,
involves more than one fixed-point process (\<^eg> \<open>P \<lbrakk>A\<rbrakk> Q \<sqsubseteq> S\<close>). In this situation,
we need parallel fixed-point inductions. The HOLCF library offers only a basic one:
\<^item> @{cartouche [display]\<open>... \<Longrightarrow> P \<bottom> \<bottom> \<Longrightarrow> (\<forall>X Y. P X Y \<Longrightarrow> P (f X) (g Y))
\<Longrightarrow> P (\<mu>X. f X) (\<mu>X. g X)\<close>}
\<^item> \<open>... \<Longrightarrow> P \<bottom> \<bottom> \<Longrightarrow> (\<forall>X Y. P X Y \<Longrightarrow> P (f X) (g Y)) \<Longrightarrow> P (\<mu>X. f X) (\<mu>X. g X)\<close>
\<^noindent> This form does not help in cases like in \<open>P \<lbrakk>\<emptyset>\<rbrakk> Q \<sqsubseteq> S\<close> with the interleaving operator on the
@ -774,7 +727,7 @@ The astute reader may notice here that if the induction step is weakened (having
the base steps require enforcement.
\<close>
subsection*["norm"::tc,main_author="Some(@{docitem ''safouan''}::author)"]
subsection*["norm"::technical,main_author="Some(@{author ''safouan''}::author)"]
\<open>Normalization\<close>
text\<open>
Our framework can reason not only over infinite alphabets, but also over processes parameterized
@ -795,7 +748,7 @@ This normal form is closed under deterministic and communication operators.
The advantage of this format is that we can mimic the well-known product automata construction
for an arbitrary number of synchronized processes under normal form.
We only show the case of the synchronous product of two processes: \<close>
text*[T3::"theorem", short_name="\<open>Product Construction\<close>"]\<open>
Theorem*[T3, short_name="\<open>Product Construction\<close>", level="Some 2"]\<open>
Parallel composition translates to normal form:
@{cartouche [display,indent=5]\<open>(P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>\<^sub>1,\<upsilon>\<^sub>1\<rbrakk> \<sigma>\<^sub>1) || (P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>\<^sub>2,\<upsilon>\<^sub>2\<rbrakk> \<sigma>\<^sub>2) =
P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<lambda>(\<sigma>\<^sub>1,\<sigma>\<^sub>2). \<tau>\<^sub>1 \<sigma>\<^sub>1 \<inter> \<tau>\<^sub>2 \<sigma>\<^sub>2 , \<lambda>(\<sigma>\<^sub>1,\<sigma>\<^sub>2).\<lambda>e.(\<upsilon>\<^sub>1 \<sigma>\<^sub>1 e, \<upsilon>\<^sub>2 \<sigma>\<^sub>2 e)\<rbrakk> (\<sigma>\<^sub>1,\<sigma>\<^sub>2)\<close>}
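The construction in Theorem T3 can be sketched concretely: a normalized process is a pair of an enabledness map and an update map, and the synchronous product intersects enabled events while pairing the updates componentwise. The Python encoding below is illustrative only (the names `tau`/`upd` stand in for τ/υ; it is not the Isabelle term).

```python
# Sketch of the product construction: a normalized process is (tau, upd)
# where tau maps a state to its enabled events and upd maps a state and
# an event to the successor state.

def product(tau1, upd1, tau2, upd2):
    def tau(state):
        s1, s2 = state
        return tau1(s1) & tau2(s2)           # events enabled in both
    def upd(state, e):
        s1, s2 = state
        return (upd1(s1, e), upd2(s2, e))    # advance both components
    return tau, upd

# Two counters that synchronize on a single event "t".
tau_a = lambda s: {"t"} if s < 2 else set()
upd_a = lambda s, e: s + 1
tau_b = lambda s: {"t"}
upd_b = lambda s, e: s + 10

tau, upd = product(tau_a, upd_a, tau_b, upd_b)
print(tau((0, 0)))        # {'t'}
print(upd((0, 0), "t"))   # (1, 10)
print(tau((2, 0)))        # set()
```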
@ -815,7 +768,7 @@ states via the closure \<open>\<RR>\<close>, which is defined inductively over:
Thus, normalization leads to a new characterization of deadlock-freeness inspired
by automata theory. We formally proved the following theorem:\<close>
text*[T4::"theorem", short_name="\<open>DF vs. Reachability\<close>"]
text*[T4::"theorem", short_name="\<open>DF vs. Reachability\<close>", level="Some 2"]
\<open> If each reachable state \<open>s \<in> (\<RR> \<tau> \<upsilon>)\<close> has outgoing transitions,
the \<^csp> process is deadlock-free:
@{cartouche [display,indent=10] \<open>\<forall>\<sigma> \<in> (\<RR> \<tau> \<upsilon> \<sigma>\<^sub>0). \<tau> \<sigma> \<noteq> {} \<Longrightarrow> deadlock_free (P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>,\<upsilon>\<rbrakk> \<sigma>\<^sub>0)\<close>}
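The theorem suggests a direct algorithmic check: compute the states reachable from the initial state (a stand-in for the closure above) and verify that every reached state enables at least one event. The breadth-first sketch below is an illustrative encoding with made-up names, not the formal development.

```python
# Reachability-based sufficient criterion for deadlock-freeness of a
# normalized process (tau, upd): every reachable state must enable at
# least one event.
from collections import deque

def reachable(tau, upd, s0):
    seen, todo = {s0}, deque([s0])
    while todo:
        s = todo.popleft()
        for e in tau(s):
            t = upd(s, e)
            if t not in seen:
                seen.add(t)
                todo.append(t)
    return seen

def deadlock_free_by_reach(tau, upd, s0):
    return all(tau(s) for s in reachable(tau, upd, s0))

# A 3-state cycle always enables an event: the criterion holds.
tau = lambda s: {"step"}
upd = lambda s, e: (s + 1) % 3
print(deadlock_free_by_reach(tau, upd, 0))   # True

# Disabling everything in state 2 yields a reachable stuck state.
tau2 = lambda s: {"step"} if s != 2 else set()
print(deadlock_free_by_reach(tau2, upd, 0))  # False
```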
@ -834,7 +787,7 @@ Summing up, our method consists of four stages:
\<close>
subsection*["dining_philosophers"::tc,main_author="Some(@{docitem ''safouan''}::author)",level="Some 3"]
subsection*["dining_philosophers"::technical,main_author="Some(@{author ''safouan''}::author)",level="Some 3"]
\<open>Generalized Dining Philosophers\<close>
text\<open> The dining philosophers problem is another paradigmatic example in the \<^csp> literature
@ -926,7 +879,7 @@ for a dozen of philosophers (on a usual machine) due to the exponential combinat
Furthermore, our proof is fairly stable against modifications like adding non-synchronized events such as
thinking or sitting down, in contrast to model-checking techniques. \<close>
section*["relatedwork"::tc,main_author="Some(@{docitem ''lina''}::author)",level="Some 3"]
section*["relatedwork"::technical,main_author="Some(@{author ''lina''}::author)",level="Some 3"]
\<open>Related work\<close>
text\<open>
@ -993,7 +946,7 @@ restrictions on the structure of components. None of our paradigmatic examples c
be automatically proven with any of the discussed SMT techniques without restrictions.
\<close>
section*["conclusion"::conclusion,main_author="Some(@{docitem ''bu''}::author)"]\<open>Conclusion\<close>
section*["conclusion"::conclusion,main_author="Some(@{author ''bu''}::author)"]\<open>Conclusion\<close>
text\<open>We presented a formalisation of the most comprehensive semantic model for \<^csp>, a 'classical'
language for the specification and analysis of concurrent systems studied in a rich body of
literature. For this purpose, we ported @{cite "tej.ea:corrected:1997"} to a modern version
@ -1024,10 +977,6 @@ over finite sub-systems with globally infinite systems in a logically safe way.
subsection*[bib::bibliography]\<open>References\<close>
close_monitor*[this]
(*
term\<open>\<longrightarrow>\<close>
term\<open> demon \<sigma>\<^sub>g\<^sub>l\<^sub>o\<^sub>b\<^sub>a\<^sub>l := \<Sqinter> \<Delta>t \<in> \<real>\<^sub>>\<^sub>0. ||| i\<in>A. ACTOR i \<sigma>\<^sub>g\<^sub>l\<^sub>o\<^sub>b\<^sub>a\<^sub>l
\<lbrakk>S\<rbrakk> sync!\<sigma>\<^sub>g\<^sub>l\<^sub>o\<^sub>b\<^sub>a\<^sub>l\<^sub>' \<longrightarrow> demon \<sigma>\<^sub>g\<^sub>l\<^sub>o\<^sub>b\<^sub>a\<^sub>l\<^sub>' \<close>
*)
end
(*>*)

View File

@ -1,7 +1,6 @@
theory PikeOS_ST (*Security Target *)
imports "../../../src/ontologies/CC_v3.1_R5/CC_v3_1_R5"
(* Isabelle_DOF.CC_v3_1_R5 in the future. *)
imports "Isabelle_DOF-Ontologies.CC_v3_1_R5"
begin
@ -18,18 +17,20 @@ text*[pkosstref::st_ref_cls, title="''PikeOS Security Target''", st_version ="(0
It complies with the Common Criteria for Information Technology Security Evaluation
Version 3.1 Revision 4.\<close>
subsection*[pkossttoerefsubsec::st_ref_cls]\<open>TOE Reference\<close>
text*[pkostoeref::toe_ref_cls, dev_name="''''", toe_name="''PikeOS''",
toe_version= "(0,3,4)", prod_name="Some ''S3725''"]
\<open>The @{docitem toe_def} is the operating system PikeOS version 3.4
\<open>The @{docitem (unchecked) toeDef} is the operating system PikeOS version 3.4
running on the microprocessor family x86 hosting different applications.
The @{docitem toe_def} is referenced as PikeOS 3.4 base
The @{docitem (unchecked) toeDef} is referenced as PikeOS 3.4 base
product build S3725 for Linux and Windows development host with PikeOS 3.4
Certification Kit build S4250 and PikeOS 3.4 Common Criteria Kit build S4388.\<close>
subsection*[pkossttoeovrvwsubsec::st_ref_cls]\<open> TOE Overview \<close>
text*[pkosovrw1::toe_ovrw_cls]\<open>The @{definition \<open>toe\<close> } is a special kind of operating
text*[pkosovrw1::toe_ovrw_cls]\<open>The @{docitem (unchecked) \<open>toeDef\<close> } is a special kind of operating
system that allows one to effectively separate
different applications running on the same platform from each other. The TOE can host
user applications that can also be operating systems. User applications can also be
@ -87,4 +88,4 @@ open_monitor*[PikosSR::SEC_REQ_MNT]
close_monitor*[PikosSR]
close_monitor*[stpkos]
end
end

View File

@ -0,0 +1,4 @@
session "PikeOS_study" = "Isabelle_DOF-Ontologies" +
options [document = false]
theories
"PikeOS_ST"

View File

@ -0,0 +1 @@
PikeOS_study

View File

@ -1,16 +1,16 @@
session "mini_odo" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof,
dof_ontologies = "Isabelle_DOF.technical_report Isabelle_DOF.cenelec_50128",
dof_template = "Isabelle_DOF.scrreprt-modern"]
session "mini_odo" = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof]
sessions
"Physical_Quantities"
theories
"mini_odo"
document_theories
"Isabelle_DOF-Ontologies.CENELEC_50128"
document_files
"dof_session.tex"
"preamble.tex"
"root.bib"
"root.mst"
"lstisadof.sty"
"figures/df-numerics-encshaft.png"
"figures/odometer.jpeg"
"figures/three-phase-odo.pdf"

View File

@ -0,0 +1,3 @@
\input{mini_odo}
\input{CENELEC_50128}

View File

@ -13,8 +13,6 @@
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause
%% This is a placeholder for user-specific configuration and packages.
\usepackage{listings}
\usepackage{lstisadof}
\usepackage{wrapfig}
\usepackage{paralist}
\usepackage{numprint}

View File

@ -15,10 +15,12 @@
theory
mini_odo
imports
"Isabelle_DOF.CENELEC_50128"
"Isabelle_DOF-Ontologies.CENELEC_50128"
"Isabelle_DOF.technical_report"
"Physical_Quantities.SI" "Physical_Quantities.SI_Pretty"
begin
use_template "scrreprt-modern"
use_ontology technical_report and "Isabelle_DOF-Ontologies.CENELEC_50128"
declare[[strict_monitor_checking=true]]
define_shortcut* dof \<rightleftharpoons> \<open>\dof\<close>
isadof \<rightleftharpoons> \<open>\isadof{}\<close>
@ -100,13 +102,13 @@ text\<open>
functioning of the system and for its integration into the system as a whole. In
particular, we need to make the following assumptions explicit: \<^vs>\<open>-0.3cm\<close>\<close>
text*["perfect-wheel"::assumption]
text*["perfect_wheel"::assumption]
\<open>\<^item> the wheel is perfectly circular with a given, constant radius. \<^vs>\<open>-0.3cm\<close>\<close>
text*["no-slip"::assumption]
text*["no_slip"::assumption]
\<open>\<^item> the slip between the train's wheel and the track is negligible. \<^vs>\<open>-0.3cm\<close>\<close>
text*["constant-teeth-dist"::assumption]
text*["constant_teeth_dist"::assumption]
\<open>\<^item> the distance between all teeth of a wheel is the same and constant, and \<^vs>\<open>-0.3cm\<close>\<close>
text*["constant-sampling-rate"::assumption]
text*["constant_sampling_rate"::assumption]
\<open>\<^item> the sampling rate of positions is a given constant.\<close>
text\<open>
@ -124,13 +126,13 @@ text\<open>
subsection\<open>Capturing ``System Architecture.''\<close>
figure*["three-phase"::figure,relative_width="70",src="''figures/three-phase-odo''"]
figure*["three_phase"::figure,relative_width="70",file_src="''figures/three-phase-odo.pdf''"]
\<open>An odometer with three sensors \<open>C1\<close>, \<open>C2\<close>, and \<open>C3\<close>.\<close>
text\<open>
The requirements analysis also contains a document \<^doc_class>\<open>SYSAD\<close>
(\<^typ>\<open>system_architecture_description\<close>) that contains a technical drawing of the odometer,
a timing diagram (see \<^figure>\<open>three-phase\<close>), and tables describing the encoding of the position
a timing diagram (see \<^figure>\<open>three_phase\<close>), and tables describing the encoding of the position
for the possible signal transitions of the sensors \<open>C1\<close>, \<open>C2\<close>, and \<open>C3\<close>.
\<close>
@ -144,7 +146,7 @@ text\<open>
sub-system configuration. \<close>
(*<*)
declare_reference*["df-numerics-encshaft"::figure]
declare_reference*["df_numerics_encshaft"::figure]
(*>*)
subsection\<open>Capturing ``Required Performances.''\<close>
text\<open>
@ -158,9 +160,9 @@ text\<open>
The requirement analysis document describes the physical environment, the architecture
of the measuring device, and the required format and precision of the measurements of the odometry
function as represented (see @{figure (unchecked) "df-numerics-encshaft"}).\<close>
function as represented (see @{figure (unchecked) "df_numerics_encshaft"}).\<close>
figure*["df-numerics-encshaft"::figure,relative_width="76",src="''figures/df-numerics-encshaft''"]
figure*["df_numerics_encshaft"::figure,relative_width="76",file_src="''figures/df-numerics-encshaft.png''"]
\<open>Real distance vs. discrete distance vs. shaft-encoder sequence\<close>
@ -213,7 +215,7 @@ text\<open>
concepts such as Cauchy Sequences, limits, differentiability, and a very substantial part of
classical Calculus. \<open>SOME\<close> is the Hilbert choice operator from HOL; the definitions of the
model parameters admit all possible positive values as uninterpreted constants. Our
\<^assumption>\<open>perfect-wheel\<close> is translated into a calculation of the circumference of the
\<^assumption>\<open>perfect_wheel\<close> is translated into a calculation of the circumference of the
wheel, while \<open>\<delta>s\<^sub>r\<^sub>e\<^sub>s\<close>, the resolution of the odometer, can be calculated
from these parameters. HOL-Analysis makes it possible to formalize the fundamental physical observables:
\<close>
@ -294,7 +296,7 @@ and the global model parameters such as wheel diameter, the number of teeth per
sampling frequency etc., we can infer the maximal time of service as well as the maximum distance
the device can measure. As an example configuration, choosing:
\<^item> \<^term>\<open>(1 *\<^sub>Q metre)::real[m]\<close> for \<^term>\<open>w\<^sub>d\<close> (wheel-diameter),
\<^item> \<^term>\<open>(1 *\<^sub>Q metre):: real[m]\<close> for \<^term>\<open>w\<^sub>d\<close> (wheel-diameter),
\<^item> \<^term>\<open>100 :: real\<close> for \<^term>\<open>tpw\<close> (teeth per wheel),
\<^item> \<^term>\<open>80 *\<^sub>Q kmh :: real[m\<cdot>s\<^sup>-\<^sup>1]\<close> for \<^term>\<open>Speed\<^sub>M\<^sub>a\<^sub>x\<close>,
\<^item> \<^term>\<open>14.4 *\<^sub>Q kHz :: real[s\<^sup>-\<^sup>1]\<close> for the sampling frequency,
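From these configuration values the odometer resolution follows by elementary arithmetic. A quick sanity-check sketch with plain floats (the theory itself works with SI-typed terms; the numbers below merely mirror the listed values):

```python
# Sanity check of the example odometer configuration.
import math

wheel_diameter = 1.0        # m
teeth_per_wheel = 100
speed_max = 80 / 3.6        # 80 km/h in m/s
sampling_freq = 14.4e3      # Hz

# One tooth corresponds to 1/100 of the wheel circumference.
delta_s = math.pi * wheel_diameter / teeth_per_wheel
print(round(delta_s, 4))    # 0.0314  (metres per tooth)

# Tooth rate at maximum speed must stay well below the sampling rate.
tooth_rate = speed_max / delta_s
print(tooth_rate < sampling_freq)   # True
```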
@ -626,14 +628,14 @@ text\<open>
\<close>
text\<open>Examples for declaration of typed doc-classes "assumption" (sic!) and "hypothesis" (sic!!),
concepts defined in the underlying ontology @{theory "Isabelle_DOF.CENELEC_50128"}. \<close>
concepts defined in the underlying ontology @{theory "Isabelle_DOF-Ontologies.CENELEC_50128"}. \<close>
text*[ass2::assumption, long_name="Some ''assumption one''"] \<open> The subsystem Y is safe. \<close>
text*[hyp1::hypothesis] \<open> \<open>P \<noteq> NP\<close> \<close>
text\<open>
A real example fragment fsrom a larger project, declaring a text-element as a
A real example fragment from a larger project, declaring a text-element as a
"safety-related application condition", a concept defined in the
@{theory "Isabelle_DOF.CENELEC_50128"} ontology:\<close>
@{theory "Isabelle_DOF-Ontologies.CENELEC_50128"} ontology:\<close>
text*[hyp2::hypothesis]\<open>Under the assumption @{assumption \<open>ass2\<close>} we establish the following: ... \<close>
@ -654,11 +656,10 @@ text*[t10::test_result]
test-execution via, \<^eg>, a makefile or specific calls to a test-environment or test-engine. \<close>
text
\<open> Finally some examples of references to doc-items, i.e. text-elements
with declared meta-information and status. \<close>
text \<open> As established by @{test_result (unchecked) \<open>t10\<close>},
@{test_result (define) \<open>t10\<close>} \<close>
text \<open> Finally some examples of references to doc-items, i.e. text-elements
with declared meta-information and status. \<close>
text \<open> As established by @{test_result \<open>t10\<close>}\<close>
text \<open> the @{test_result \<open>t10\<close>}
as well as the @{SRAC \<open>ass122\<close>}\<close>
text \<open> represent a justification of the safety related applicability
@ -669,7 +670,6 @@ text \<open> due to notational conventions for antiquotations, one may even writ
"represent a justification of the safety related applicability
condition \<^SRAC>\<open>ass122\<close> aka exported constraint \<^EC>\<open>ass122\<close>."\<close>
(*<*)
end
(*>*)

View File

@ -1,3 +1,5 @@
scholarly_paper
technical_report
CENELEC_50128
cytology
CC_ISO15408
beamerx

View File

@ -0,0 +1,2 @@
poster
presentation

View File

@ -0,0 +1,8 @@
chapter AFP
session "poster-example" (AFP) = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof, timeout = 300]
theories
"poster"
document_files
"preamble.tex"

View File

@ -0,0 +1,2 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -0,0 +1,39 @@
(*<*)
theory "poster"
imports "Isabelle_DOF.scholarly_paper"
"Isabelle_DOF-Ontologies.document_templates"
begin
use_template "beamerposter-UNSUPPORTED"
use_ontology "scholarly_paper"
(*>*)
title*[tit::title]\<open>Example Presentation\<close>
author*[safouan,email="\<open>example@example.org\<close>",affiliation="\<open>Example Org\<close>"]\<open>Eliza Example\<close>
text\<open>
\vfill
\begin{block}{\large Fontsizes}
\centering
{\tiny tiny}\par
{\scriptsize scriptsize}\par
{\footnotesize footnotesize}\par
{\normalsize normalsize}\par
{\large large}\par
{\Large Large}\par
{\LARGE LARGE}\par
{\veryHuge veryHuge}\par
{\VeryHuge VeryHuge}\par
{\VERYHuge VERYHuge}\par
\end{block}
\vfill
\<close>
text\<open>
@{block (title = "\<open>Title\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close>") "\<open>Block content\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close>"}
\<close>
(*<*)
end
(*>*)

View File

@ -0,0 +1,9 @@
chapter AFP
session "presentation-example" (AFP) = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof, timeout = 300]
theories
"presentation"
document_files
"preamble.tex"
"figures/A.png"

View File

Binary image changed (before: 12 KiB, after: 12 KiB).

View File

@ -0,0 +1,2 @@
%% This is a placeholder for user-specific configuration and packages.

View File

@ -0,0 +1,69 @@
(*<*)
theory "presentation"
imports "Isabelle_DOF.scholarly_paper"
"Isabelle_DOF-Ontologies.document_templates"
begin
use_template "beamer-UNSUPPORTED"
use_ontology "scholarly_paper"
(*>*)
title*[tit::title]\<open>Example Presentation\<close>
author*[safouan,email="\<open>example@example.org\<close>",affiliation="\<open>Example Org\<close>"]\<open>Eliza Example\<close>
text\<open>
\begin{frame}
\frametitle{Example Slide}
\centering\huge This is an example!
\end{frame}
\<close>
frame*[test_frame
, frametitle = \<open>\<open>\<open>Example Slide\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close> with items @{thm "HOL.refl"}\<close>\<close>
, framesubtitle = "''Subtitle''"]
\<open>This is an example!
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> and the term encoding the title of this frame is \<^term_>\<open>frametitle @{frame \<open>test_frame\<close>}\<close>\<close>
frame*[test_frame2
, frametitle = "''Example Slide''"
, framesubtitle = \<open>\<open>\<open>Subtitle\<^sub>t\<^sub>e\<^sub>s\<^sub>t:\<close> the value of \<^term>\<open>(3::int) + 3\<close> is @{value "(3::int) + 3"}\<close>\<close>]
\<open>Test frame env \<^term>\<open>refl\<close>\<close>
frame*[test_frame3, frametitle = "''A slide with a Figure''"]
\<open>A figure
@{figure_content (width=45, caption=\<open>\<open>Figure\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close> is not the \<^term>\<open>refl\<close> theorem (@{thm "refl"}).\<close>)
"figures/A.png"}\<close>
frame*[test_frame4
, options = "''allowframebreaks''"
, frametitle = "''Example Slide with frame break''"
, framesubtitle = \<open>\<open>\<open>Subtitle\<^sub>t\<^sub>e\<^sub>s\<^sub>t:\<close> the value of \<^term>\<open>(3::int) + 3\<close> is @{value "(3::int) + 3"}\<close>\<close>]
\<open>
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> and the term encoding the title of this frame is \<^term_>\<open>frametitle @{frame \<open>test_frame4\<close>}\<close>
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<close>
(*<*)
end
(*>*)

View File

@ -68,7 +68,7 @@ onto_class procaryotic_cells = cell +
onto_class eucaryotic_cells = cell +
organelles :: "organelles' list"
invariant has_nucleus :: "\<lambda>\<sigma>::eucaryotic_cells. \<exists> org \<in> set (organelles \<sigma>). is\<^sub>n\<^sub>u\<^sub>c\<^sub>l\<^sub>e\<^sub>u\<^sub>s org"
invariant has_nucleus :: "\<exists> org \<in> set (organelles \<sigma>). is\<^sub>n\<^sub>u\<^sub>c\<^sub>l\<^sub>e\<^sub>u\<^sub>s org"
\<comment> \<open>Cells must have at least one nucleus. However, this should be executable.\<close>
find_theorems (70)name:"eucaryotic_cells"
@ -78,13 +78,10 @@ value "is\<^sub>n\<^sub>u\<^sub>c\<^sub>l\<^sub>e\<^sub>u\<^sub>s (mk\<^sub>n\<^
term \<open>eucaryotic_cells.organelles\<close>
value \<open>(eucaryotic_cells.organelles(eucaryotic_cells.make X Y Z Z Z [] 3 []))\<close>
value \<open>has_nucleus_inv(eucaryotic_cells.make X Y Z Z Z [] 3 [])\<close>
value \<open>has_nucleus_inv(eucaryotic_cells.make X Y Z Z Z [] 3
[upcast\<^sub>n\<^sub>u\<^sub>c\<^sub>l\<^sub>e\<^sub>u\<^sub>s (nucleus.make a b c d [])])\<close>
value \<open>(eucaryotic_cells.organelles(eucaryotic_cells.make X Y Z Z Z [] []))\<close>
value \<open>has_nucleus_inv(eucaryotic_cells.make X Y Z Z Z [] [])\<close>
value \<open>has_nucleus_inv(eucaryotic_cells.make X Y Z Z Z [] [upcast\<^sub>n\<^sub>u\<^sub>c\<^sub>l\<^sub>e\<^sub>u\<^sub>s (nucleus.make a b c )])\<close>
end

View File

@ -0,0 +1,4 @@
session "Cytology" = "Isabelle_DOF" +
options [document = false]
theories
"Cytology"

View File

@ -1,2 +1 @@
Isabelle_DOF-Manual
TR_my_commented_isabelle

View File

@ -1,7 +1,5 @@
session "TR_MyCommentedIsabelle" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof,
dof_ontologies = "Isabelle_DOF.technical_report", dof_template = Isabelle_DOF.scrreprt,
quick_and_dirty = true]
options [document = pdf, document_output = "output", document_build = dof]
theories
"TR_MyCommentedIsabelle"
document_files

View File

@ -14,9 +14,11 @@
(*<*)
theory TR_MyCommentedIsabelle
imports "Isabelle_DOF.technical_report"
begin
use_template "scrreprt"
use_ontology "technical_report"
define_shortcut* isabelle \<rightleftharpoons> \<open>Isabelle/HOL\<close>
open_monitor*[this::report]
@ -62,7 +64,7 @@ text\<open> \<^vs>\<open>-0.5cm\<close>
Isabelle and Isabelle/HOL, a complementary text to the unfortunately somewhat outdated
"The Isabelle Cookbook" in \<^url>\<open>https://nms.kcl.ac.uk/christian.urban/Cookbook/\<close>.
The present text is also complementary to the current version of
\<^url>\<open>https://isabelle.in.tum.de/dist/Isabelle2021/doc/isar-ref.pdf\<close>
\<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>
"The Isabelle/Isar Implementation" by Makarius Wenzel in that it focusses on subjects
not covered there, or presents alternative explanations for which I believe, based on my
experiences with students and PhDs, that they are helpful.
@ -77,7 +79,7 @@ text\<open> \<^vs>\<open>-0.5cm\<close>
maximum of formal content, which makes this text re-checkable at each load and easier
to maintain. \<close>
figure*[architecture::figure,relative_width="70",src="''figures/isabelle-architecture''"]\<open>
figure*[architecture::figure,relative_width="70",file_src="''figures/isabelle-architecture.pdf''"]\<open>
The system architecture of Isabelle (left-hand side) and the asynchronous communication
between the Isabelle system and the IDE (right-hand side). \<close>
@ -146,7 +148,7 @@ text\<open> \<open>*\<open>This is a text.\<close>\<close>\<close>
text\<open>and displayed on the screen in the Isabelle/jEdit front-end by:\<close>
figure*[fig2::figure, relative_width="60", placement="pl_h", src="''figures/text-element''"]
figure*[fig2::figure, relative_width="60", file_src="''figures/text-element.pdf''"]
\<open>A text-element as presented in Isabelle/jEdit.\<close>
text\<open>The text-commands, ML-commands (and in principle any other command) can be seen as
@ -345,7 +347,7 @@ text\<open>
\<^item> \<^ML>\<open>Context.proper_subthy : theory * theory -> bool\<close> subcontext test
\<^item> \<^ML>\<open>Context.Proof: Proof.context -> Context.generic \<close> A constructor embedding local contexts
\<^item> \<^ML>\<open>Context.proof_of : Context.generic -> Proof.context\<close> the inverse
\<^item> \<^ML>\<open>Context.theory_name : theory -> string\<close>
\<^item> \<^ML>\<open>Context.theory_name : {long:bool} -> theory -> string\<close>
\<^item> \<^ML>\<open>Context.map_theory: (theory -> theory) -> Context.generic -> Context.generic\<close>
\<close>
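As an illustrative aside (not part of the original theory text), the embedding and projection operations listed above can be exercised in a minimal sketch; it assumes only the signatures shown in the list:

```sml
(* Sketch only: embed a local context into the generic sum-type and
   project it back, using Context.Proof and Context.proof_of. *)
ML \<open>
  val ctxt    = @{context};                (* current local context *)
  val generic = Context.Proof ctxt;        (* embed: Proof.context -> Context.generic *)
  val ctxt'   = Context.proof_of generic;  (* project back to Proof.context *)
\<close>
```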
@ -356,7 +358,7 @@ text\<open>The structure \<^ML_structure>\<open>Proof_Context\<close> provides a
\<^item> \<^ML>\<open> Context.Proof: Proof.context -> Context.generic \<close>
the path to a generic Context, i.e. a sum-type of global and local contexts
in order to simplify system interfaces
\<^item> \<^ML>\<open> Proof_Context.get_global: theory -> string -> Proof.context\<close>
\<^item> \<^ML>\<open> Proof_Context.get_global: {long:bool} -> theory -> string -> Proof.context\<close>
\<close>
@ -364,7 +366,7 @@ subsection*[t213::example]\<open>Mechanism 2 : Extending the Global Context \<op
text\<open>A central mechanism for constructing user-defined data is by the \<^ML_functor>\<open>Generic_Data\<close>-functor.
A plugin needing some data \<^verbatim>\<open>T\<close> and providing it with implementations for an
\<^verbatim>\<open>empty\<close>, and operations \<^verbatim>\<open>merge\<close> and \<^verbatim>\<open>extend\<close>, can construct a lense with operations
\<^verbatim>\<open>empty\<close>, and operation \<^verbatim>\<open>merge\<close>, can construct a lens with operations
\<^verbatim>\<open>get\<close> and \<^verbatim>\<open>put\<close> that attach this data to the generic system context. Rather than using
unsynchronized SML mutable variables, this is the mechanism to introduce component-local
data in Isabelle, which makes it possible to manage this data for the necessary backtrack and synchronization
@ -373,14 +375,12 @@ text\<open>A central mechanism for constructing user-defined data is by the \<^M
ML \<open>
datatype X = mt
val init = mt;
val ext = I
fun merge (X,Y) = mt
structure Data = Generic_Data
(
type T = X
val empty = init
val extend = ext
val merge = merge
);
\<close>
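As a hedged aside (not in the original text), the lens produced by the \<^ML_functor>\<open>Generic_Data\<close> instantiation above is used through its \<^verbatim>\<open>get\<close>, \<^verbatim>\<open>put\<close> and \<^verbatim>\<open>map\<close> operations on the generic context:

```sml
(* Sketch: reading and updating the component data registered above.
   Data.get and Data.map operate on Context.generic values. *)
ML \<open>
  val x   = Data.get (Context.Theory @{theory});  (* read the component data *)
  val upd = Data.map (fn _ => mt);                (* an update function on generic contexts *)
\<close>
```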
@ -807,18 +807,13 @@ text\<open> They reflect the Pure logic depicted in a number of presentations s
Notated as logical inference rules, these operations were presented as follows:
\<close>
side_by_side_figure*["text-elements"::side_by_side_figure,anchor="''fig-kernel1''",
caption="''Pure Kernel Inference Rules I ''",relative_width="48",
src="''figures/pure-inferences-I''",anchor2="''fig-kernel2''",
caption2="''Pure Kernel Inference Rules II''",relative_width2="47",
src2="''figures/pure-inferences-II''"]\<open> \<close>
text*["text_elements"::float,
main_caption="\<open>Kernel Inference Rules.\<close>"]
\<open>
@{fig_content (width=48, caption="Pure Kernel Inference Rules I.") "figures/pure-inferences-I.pdf"
}\<^hfill>@{fig_content (width=47, caption="Pure Kernel Inference Rules II.") "figures/pure-inferences-II.pdf"}
\<close>
(*
figure*[kir1::figure,relative_width="100",src="''figures/pure-inferences-I''"]
\<open> Pure Kernel Inference Rules I.\<close>
figure*[kir2::figure,relative_width="100",src="''figures/pure-inferences-II''"]
\<open> Pure Kernel Inference Rules II. \<close>
*)
text\<open>Note that the transfer rule:
\[
@ -891,7 +886,6 @@ datatype thy = Thy of
\<^item> \<^ML>\<open>Theory.axiom_space: theory -> Name_Space.T\<close>
\<^item> \<^ML>\<open>Theory.all_axioms_of: theory -> (string * term) list\<close>
\<^item> \<^ML>\<open>Theory.defs_of: theory -> Defs.T\<close>
\<^item> \<^ML>\<open>Theory.join_theory: theory list -> theory\<close>
\<^item> \<^ML>\<open>Theory.at_begin: (theory -> theory option) -> theory -> theory\<close>
\<^item> \<^ML>\<open>Theory.at_end: (theory -> theory option) -> theory -> theory\<close>
\<^item> \<^ML>\<open>Theory.begin_theory: string * Position.T -> theory list -> theory\<close>
@ -1144,8 +1138,7 @@ text\<open>
necessary infrastructure --- i.e. the operations to pack and unpack theories and
queries on it:
\<^item> \<^ML>\<open> Toplevel.theory_toplevel: theory -> Toplevel.state\<close>
\<^item> \<^ML>\<open> Toplevel.init_toplevel: unit -> Toplevel.state\<close>
\<^item> \<^ML>\<open> Toplevel.make_state: theory option -> Toplevel.state\<close>
\<^item> \<^ML>\<open> Toplevel.is_toplevel: Toplevel.state -> bool\<close>
\<^item> \<^ML>\<open> Toplevel.is_theory: Toplevel.state -> bool\<close>
\<^item> \<^ML>\<open> Toplevel.is_proof: Toplevel.state -> bool\<close>
@ -1183,7 +1176,7 @@ text\<open> The extensibility of Isabelle as a system framework depends on a num
\<^item> \<^ML>\<open>Toplevel.theory: (theory -> theory) -> Toplevel.transition -> Toplevel.transition\<close>
adjoins a theory transformer.
\<^item> \<^ML>\<open>Toplevel.generic_theory: (generic_theory -> generic_theory) -> Toplevel.transition -> Toplevel.transition\<close>
\<^item> \<^ML>\<open>Toplevel.theory': (bool -> theory -> theory) -> Toplevel.presentation -> Toplevel.transition -> Toplevel.transition\<close>
\<^item> \<^ML>\<open>Toplevel.theory': (bool -> theory -> theory) -> Toplevel.presentation option -> Toplevel.transition -> Toplevel.transition\<close>
\<^item> \<^ML>\<open>Toplevel.exit: Toplevel.transition -> Toplevel.transition\<close>
\<^item> \<^ML>\<open>Toplevel.ignored: Position.T -> Toplevel.transition\<close>
\<^item> \<^ML>\<open>Toplevel.present_local_theory: (xstring * Position.T) option ->
@ -1201,7 +1194,6 @@ text\<open>
\<^item> \<^ML>\<open>Document.state : unit -> Document.state\<close>, giving the state as a "collection" of named
nodes, each consisting of an editable list of commands, associated with an asynchronous
execution process,
\<^item> \<^ML>\<open>Session.get_keywords : unit -> Keyword.keywords\<close>, this looks to be session global,
\<^item> \<^ML>\<open>Thy_Header.get_keywords : theory -> Keyword.keywords\<close> this looks to be just theory global.
@ -1275,7 +1267,6 @@ subsection\<open>Miscellaneous\<close>
text\<open>Here are a few queries relevant for the global config of the isar engine:\<close>
ML\<open> Document.state();\<close>
ML\<open> Session.get_keywords(); (* this looks to be session global. *) \<close>
ML\<open> Thy_Header.get_keywords @{theory};(* this looks to be really theory global. *) \<close>
@ -1440,7 +1431,7 @@ text\<open>The document model foresees a number of text files, which are organize
secondary formats can be \<^verbatim>\<open>.sty\<close>,\<^verbatim>\<open>.tex\<close>, \<^verbatim>\<open>.png\<close>, \<^verbatim>\<open>.pdf\<close>, or other files processed
by Isabelle and listed in a configuration processed by the build system.\<close>
figure*[fig3::figure, relative_width="100",src="''figures/document-model''"]
figure*[fig3::figure, relative_width="100",file_src="''figures/document-model.pdf''"]
\<open>A Theory-Graph in the Document Model\<close>
text\<open>A \<^verbatim>\<open>.thy\<close> file consists of a \<^emph>\<open>header\<close>, a \<^emph>\<open>context-definition\<close> and
@ -1535,7 +1526,7 @@ text\<open> ... uses the antiquotation @{ML "@{here}"} to infer from the system
of itself in the global document, converts it to markup (a string-representation of it) and sends
it via the usual @{ML "writeln"} to the interface. \<close>
figure*[hyplinkout::figure,relative_width="40",src="''figures/markup-demo''"]
figure*[hyplinkout::figure,relative_width="40",file_src="''figures/markup-demo.png''"]
\<open>Output with hyperlinked position.\<close>
text\<open>@{figure \<open>hyplinkout\<close>} shows the produced output where the little house-like symbol in the
@ -1637,7 +1628,7 @@ val data = \<comment> \<open>Derived from Yakoub's example ;-)\<close>
, (\<open>Frédéric II\<close>, \<open>King of Sicily\<close>)
, (\<open>Frédéric III\<close>, \<open>the Handsome\<close>)
, (\<open>Frédéric IV\<close>, \<open>of the Empty Pockets\<close>)
, (\<open>Frédéric V\<close>, \<open>King of DenmarkNorway\<close>)
, (\<open>Frédéric V\<close>, \<open>King of Denmark-Norway\<close>)
, (\<open>Frédéric VI\<close>, \<open>the Knight\<close>)
, (\<open>Frédéric VII\<close>, \<open>Count of Toggenburg\<close>)
, (\<open>Frédéric VIII\<close>, \<open>Count of Zollern\<close>)
@ -1882,18 +1873,17 @@ Common Stuff related to Inner Syntax Parsing
\<^item>\<^ML>\<open>Args.internal_typ : typ parser\<close>
\<^item>\<^ML>\<open>Args.internal_term: term parser\<close>
\<^item>\<^ML>\<open>Args.internal_fact: thm list parser\<close>
\<^item>\<^ML>\<open>Args.internal_attribute: (morphism -> attribute) parser\<close>
\<^item>\<^ML>\<open>Args.internal_declaration: declaration parser\<close>
\<^item>\<^ML>\<open>Args.internal_attribute: attribute Morphism.entity parser\<close>
\<^item>\<^ML>\<open>Args.alt_name : string parser\<close>
\<^item>\<^ML>\<open>Args.liberal_name: string parser\<close>
Common Isar Syntax
\<^item>\<^ML>\<open>Args.named_source: (Token.T -> Token.src) -> Token.src parser\<close>
\<^item>\<^ML>\<open>Args.named_typ : (string -> typ) -> typ parser\<close>
\<^item>\<^ML>\<open>Args.named_term : (string -> term) -> term parser\<close>
\<^item>\<^ML>\<open>Args.embedded_declaration: (Input.source -> declaration) -> declaration parser\<close>
\<^item>\<^ML>\<open>Args.embedded_declaration: (Input.source -> Morphism.declaration_entity) ->
Morphism.declaration_entity parser\<close>
\<^item>\<^ML>\<open>Args.typ_abbrev : typ context_parser\<close>
\<^item>\<^ML>\<open>Args.typ: typ context_parser\<close>
\<^item>\<^ML>\<open>Args.term: term context_parser\<close>
@ -2153,7 +2143,7 @@ text\<open>
\<^item>\<^ML>\<open>Document_Output.output_document: Proof.context -> {markdown: bool} -> Input.source -> Latex.text \<close>
\<^item>\<^ML>\<open>Document_Output.output_token: Proof.context -> Token.T -> Latex.text \<close>
\<^item>\<^ML>\<open>Document_Output.output_source: Proof.context -> string -> Latex.text \<close>
\<^item>\<^ML>\<open>Document_Output.present_thy: Options.T -> theory -> Document_Output.segment list -> Latex.text \<close>
\<^item>\<^ML>\<open>Document_Output.present_thy: Options.T -> Keyword.keywords -> string -> Document_Output.segment list -> Latex.text \<close>
\<^item>\<^ML>\<open>Document_Output.isabelle: Proof.context -> Latex.text -> Latex.text\<close>
\<^item>\<^ML>\<open>Document_Output.isabelle_typewriter: Proof.context -> Latex.text -> Latex.text\<close>
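A small sketch (an aside, relying only on the signature of \<^ML>\<open>Document_Output.output_document\<close> listed above) of turning a snippet of document source into LaTeX text:

```sml
(* Sketch: render document source as LaTeX; the cartouche literal is an
   Input.source, the result a Latex.text. *)
ML \<open>
  val latex =
    Document_Output.output_document @{context} {markdown = false}
      \<open>Some document text\<close>;
\<close>
```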


@ -10,15 +10,20 @@
* SPDX-License-Identifier: BSD-2-Clause
*************************************************************************)
chapter\<open>Common Criteria Definitions\<close>
chapter\<open>Common Criteria\<close>
section\<open>Terminology\<close>
(*<<*)
theory CC_terminology
imports "Isabelle_DOF.technical_report"
imports
"Isabelle_DOF.technical_report"
begin
define_ontology "DOF-CC_terminology.sty" "CC"
(*>>*)
text\<open>We re-use the class @\<open>typ math_content\<close>, which also provides a framework for
semi-formal terminology, on which this definition builds.\<close>
@ -35,20 +40,19 @@ type_synonym concept = concept_definition
declare[[ Definition_default_class="concept_definition"]]
(*>>*)
section \<open>Terminology\<close>
subsection \<open>Terminology\<close>
subsection \<open>Terms and definitions common in the CC\<close>
subsubsection \<open>Terms and definitions common in the CC\<close>
Definition* [aas_def, tag= "''adverse actions''"]
\<open>actions performed by a threat agent on an asset\<close>
declare_reference*[toe_def]
declare_reference*[toeDef]
Definition* [assts_def, tag="''assets''"]
\<open>entities that the owner of the @{docitem toe_def} presumably places value upon \<close>
\<open>entities that the owner of the @{docitem (unchecked) toeDef} presumably places value upon \<close>
Definition* [asgn_def, tag="''assignment''"]
\<open>the specification of an identified parameter in a component (of the CC) or requirement.\<close>
@ -56,7 +60,8 @@ Definition* [asgn_def, tag="''assignment''"]
declare_reference*[sfrs_def]
Definition* [assrc_def, tag="''assurance''"]
\<open>grounds for confidence that a @{docitem toe_def} meets the @{docitem sfrs_def}\<close>
\<open>grounds for confidence that a @{docitem (unchecked) toeDef}
meets the @{docitem (unchecked) sfrs_def}\<close>
Definition* [attptl_def, tag="''attack potential''"]
\<open>measure of the effort to be expended in attacking a TOE, expressed in terms of
@ -69,9 +74,10 @@ Definition* [authdata_def, tag="''authentication data''"]
\<open>information used to verify the claimed identity of a user\<close>
Definition* [authusr_def, tag = "''authorised user''"]
\<open>@{docitem toe_def} user who may, in accordance with the @{docitem sfrs_def}, perform an operation\<close>
\<open>@{docitem (unchecked) toeDef} user who may,
in accordance with the @{docitem (unchecked) sfrs_def}, perform an operation\<close>
Definition* [bpp_def, tag="''Base Protection Profile''"]
Definition* [bppDef, tag="''Base Protection Profile''"]
\<open>Protection Profile used as a basis to build a Protection Profile Configuration\<close>
Definition* [cls_def,tag="''class''"]
@ -104,8 +110,8 @@ Definition* [cfrm_def,tag="''confirm''"]
term is only applied to evaluator actions.\<close>
Definition* [cnnctvty_def, tag="''connectivity''"]
\<open>property of the @{docitem toe_def} allowing interaction with IT entities external to the
@{docitem toe_def}
\<open>property of the @{docitem (unchecked) toeDef} allowing interaction with IT entities external to the
@{docitem (unchecked) toeDef}
This includes exchange of data by wire or by wireless means, over any
distance in any environment or configuration.\<close>
@ -118,17 +124,20 @@ Definition* [cnt_vrb_def, tag="''counter, verb''"]
\<open>meet an attack where the impact of a particular threat is mitigated
but not necessarily eradicated\<close>
declare_reference*[st_def]
declare_reference*[pp_def]
declare_reference*[stDef]
declare_reference*[ppDef]
Definition* [dmnst_conf_def, tag="''demonstrable conformance''"]
\<open>relation between an @{docitem st_def} and a @{docitem pp_def}, where the @{docitem st_def}
\<open>relation between an @{docitem (unchecked) stDef} and a @{docitem (unchecked) ppDef},
where the @{docitem (unchecked) stDef}
provides a solution which solves the generic security problem in the PP
The @{docitem pp_def} and the @{docitem st_def} may contain entirely different statements that discuss
The @{docitem (unchecked) ppDef} and the @{docitem (unchecked) stDef} may contain
entirely different statements that discuss
different entities, use different concepts etc. Demonstrable conformance is
also suitable for a @{docitem toe_def} type where several similar @{docitem pp_def}s already exist, thus
allowing the ST author to claim conformance to these @{docitem pp_def}s simultaneously,
also suitable for a @{docitem (unchecked) toeDef} type
where several similar @{docitem (unchecked) ppDef}s already exist, thus
allowing the ST author to claim conformance to these @{docitem (unchecked) ppDef}s simultaneously,
thereby saving work.\<close>
Definition* [dmstrt_def, tag="''demonstrate''"]
@ -136,9 +145,10 @@ Definition* [dmstrt_def, tag="''demonstrate''"]
Definition* [dpndcy, tag="''dependency''"]
\<open>relationship between components such that if a requirement based on the depending
component is included in a @{docitem pp_def}, ST or package, a requirement based on
the component that is depended upon must normally also be included in the @{docitem pp_def},
@{docitem st_def} or package\<close>
component is included in a @{docitem (unchecked) ppDef}, ST or package, a requirement based on
the component that is depended upon must normally also be included
in the @{docitem (unchecked) ppDef},
@{docitem (unchecked) stDef} or package\<close>
Definition* [dscrb_def, tag="''describe''"]
\<open>provide specific details of an entity\<close>
@ -153,7 +163,7 @@ Definition* [dtrmn_def, tag="''determine''"]
performed which needs to be reviewed\<close>
Definition* [devenv_def, tag="''development environment''"]
\<open>environment in which the @{docitem toe_def} is developed\<close>
\<open>environment in which the @{docitem (unchecked) toeDef} is developed\<close>
Definition* [elmnt_def, tag="''element''"]
\<open>indivisible statement of a security need\<close>
@ -165,26 +175,27 @@ Definition* [ensr_def, tag="''ensure''"]
consequence is not fully certain, on the basis of that action alone.\<close>
Definition* [eval_def, tag="''evaluation''"]
\<open>assessment of a @{docitem pp_def}, an @{docitem st_def} or a @{docitem toe_def},
against defined criteria.\<close>
\<open>assessment of a @{docitem (unchecked) ppDef}, an @{docitem (unchecked) stDef}
or a @{docitem (unchecked) toeDef}, against defined criteria.\<close>
Definition* [eal_def, tag= "''evaluation assurance level''"]
\<open>set of assurance requirements drawn from CC Part 3, representing a point on the
CC predefined assurance scale, that form an assurance package\<close>
Definition* [eval_auth_def, tag="''evaluation authority''"]
\<open>body that sets the standards and monitors the quality of evaluations conducted by bodies within a specific community and
implements the CC for that community by means of an evaluation scheme\<close>
\<open>body that sets the standards and monitors the quality of evaluations conducted
by bodies within a specific community and implements the CC for that community
by means of an evaluation scheme\<close>
Definition* [eval_schm_def, tag="''evaluation scheme''"]
\<open>administrative and regulatory framework under which the CC is applied by an
evaluation authority within a specific community\<close>
Definition* [exst_def, tag="''exhaustive''"]
Definition* [exstDef, tag="''exhaustive''"]
\<open>characteristic of a methodical approach taken to perform an
analysis or activity according to an unambiguous plan
This term is used in the CC with respect to conducting an analysis or other
activity. It is related to “systematic” but is considerably stronger, in that it
activity. It is related to ``systematic'' but is considerably stronger, in that it
indicates not only that a methodical approach has been taken to perform the
analysis or activity according to an unambiguous plan, but that the plan that
was followed is sufficient to ensure that all possible avenues have been
@ -235,7 +246,7 @@ Definition* [intl_com_chan_def, tag ="''internal communication channel''"]
Definition* [int_toe_trans, tag="''internal TOE transfer''"]
\<open>communicating data between separated parts of the TOE\<close>
Definition* [inter_consist_def, tag="''internally consistent''"]
Definition* [inter_consistDef, tag="''internally consistent''"]
\<open>no apparent contradictions exist between any aspects of an entity
In terms of documentation, this means that there can be no statements within
@ -270,7 +281,7 @@ Definition* [org_sec_po_def, tag="''organisational security policy''"]
Definition* [pckg_def, tag="''package''"]
\<open>named set of either security functional or security assurance requirements
An example of a package is “EAL 3”.\<close>
An example of a package is ``EAL 3''.\<close>
Definition* [pp_config_def, tag="''Protection Profile Configuration''"]
\<open>Protection Profile composed of Base Protection Profiles and Protection Profile Module\<close>
@ -278,7 +289,7 @@ Definition* [pp_config_def, tag="''Protection Profile Configuration''"]
Definition* [pp_eval_def, tag="''Protection Profile evaluation''"]
\<open> assessment of a PP against defined criteria \<close>
Definition* [pp_def, tag="''Protection Profile''"]
Definition* [ppDef, tag="''Protection Profile''"]
\<open>implementation-independent statement of security needs for a TOE type\<close>
Definition* [ppm_def, tag="''Protection Profile Module''"]
@ -290,36 +301,37 @@ declare_reference*[tsf_def]
Definition* [prv_def, tag="''prove''"]
\<open>show correspondence by formal analysis in its mathematical sense
It is completely rigorous in all ways. Typically, “prove” is used when there is
a desire to show correspondence between two @{docitem tsf_def} representations at a high
level of rigour.\<close>
a desire to show correspondence between two @{docitem (unchecked) tsf_def}
representations at a high level of rigour.\<close>
Definition* [ref_def, tag="''refinement''"]
\<open>addition of details to a component\<close>
Definition* [role_def, tag="''role''"]
\<open>predefined set of rules establishing the allowed interactions between
a user and the @{docitem toe_def}\<close>
a user and the @{docitem (unchecked) toeDef}\<close>
declare_reference*[sfp_def]
Definition* [scrt_def, tag="''secret''"]
\<open>information that must be known only to authorised users and/or the
@{docitem tsf_def} in order to enforce a specific @{docitem sfp_def}\<close>
@{docitem (unchecked) tsf_def} in order to enforce a specific @{docitem (unchecked) sfp_def}\<close>
declare_reference*[sfr_def]
Definition* [sec_st_def, tag="''secure state''"]
\<open>state in which the @{docitem tsf_def} data are consistent and the @{docitem tsf_def}
continues correct enforcement of the @{docitem sfr_def}s\<close>
Definition* [sec_stDef, tag="''secure state''"]
\<open>state in which the @{docitem (unchecked) tsf_def} data are consistent
and the @{docitem (unchecked) tsf_def}
continues correct enforcement of the @{docitem (unchecked) sfr_def}s\<close>
Definition* [sec_att_def, tag="''security attribute''"]
\<open>property of subjects, users (including external IT products), objects,
information, sessions and/or resources that is used in defining the @{docitem sfr_def}s
and whose values are used in enforcing the @{docitem sfr_def}s\<close>
information, sessions and/or resources that is used in defining the @{docitem (unchecked) sfr_def}s
and whose values are used in enforcing the @{docitem (unchecked) sfr_def}s\<close>
Definition* [sec_def, tag="''security''"]
\<open>function policy set of rules describing specific security behaviour enforced
by the @{docitem tsf_def} and expressible as a set of @{docitem sfr_def}s\<close>
by the @{docitem (unchecked) tsf_def} and expressible as a set of @{docitem (unchecked) sfr_def}s\<close>
Definition* [sec_obj_def, tag="''security objective''"]
\<open>statement of an intent to counter identified threats and/or satisfy identified
@ -328,18 +340,21 @@ Definition* [sec_obj_def, tag="''security objective''"]
Definition* [sec_prob_def, tag ="''security problem''"]
\<open>statement which in a formal manner defines the nature and scope of the security that
the TOE is intended to address This statement consists of a combination of:
\begin{itemize}
\item threats to be countered by the TOE and its operational environment,
\item the OSPs enforced by the TOE and its operational environment, and
\item the assumptions that are upheld for the operational environment of the TOE.
\end{itemize}\<close>
\begin{itemize}
\item threats to be countered by the TOE and its operational environment,
\item the OSPs enforced by the TOE and its operational environment, and
\item the assumptions that are upheld for the operational environment of the TOE.
\end{itemize}\<close>
Definition* [sr_def, tag="''security requirement''", short_tag="Some(''SR'')"]
\<open>requirement, stated in a standardised language, which is meant to contribute
to achieving the security objectives for a TOE\<close>
text \<open>@{docitem toe_def}\<close>
Definition* [st, tag="''Security Target''", short_tag="Some(''ST'')"]
\<open>implementation-dependent statement of security needs for a specific i\<section>dentified @{docitem toe_def}\<close>
(*<*)
text \<open>@{docitem (unchecked) toeDef}\<close>
(*>*)
Definition* [st, tag="''Security Target''", short_tag="Some(''ST'')"]
\<open>implementation-dependent statement of security needs for a specific identified
@{docitem (unchecked) toeDef}\<close>
Definition* [slct_def, tag="''selection''"]
\<open>specification of one or more items from a list in a component\<close>
@ -379,13 +394,13 @@ Definition* [toe_res_def, tag="''TOE resource''"]
Definition* [toe_sf_def, tag="''TOE security functionality''", short_tag= "Some(''TSF'')"]
\<open>combined functionality of all hardware, software, and firmware of a TOE that must be relied upon
for the correct enforcement of the @{docitem sfr_def}s\<close>
for the correct enforcement of the @{docitem (unchecked) sfr_def}s\<close>
Definition* [tr_vrb_def, tag="''trace, verb''"]
\<open>perform an informal correspondence analysis between two entities with only a
minimal level of rigour\<close>
Definition* [trnsfs_out_toe_def, tag="''transfers outside of the TOE''"]
Definition* [trnsfs_out_toeDef, tag="''transfers outside of the TOE''"]
\<open>TSF mediated communication of data to entities not under the control of the TSF\<close>
Definition* [transl_def, tag= "''translation''"]
@ -430,13 +445,22 @@ effort is required of the evaluator.\<close>
Definition* [dev_def, tag="''Developer''"]
\<open>who respond to actual or perceived consumer security requirements in
constructing a @{docitem toe_def}, reference this CC_Part_3
constructing a @{docitem (unchecked) toeDef}, reference this CC\_Part\_3
when interpreting statements of assurance requirements and determining
assurance approaches of @{docitem toe}s.\<close>
Definition*[evalu_def, tag="'' Evaluator''"]
\<open>who use the assurance requirements defined in CC_Part_3
\<open>who use the assurance requirements defined in CC\_Part\_3
as mandatory statement of evaluation criteria when determining the assurance
of @{docitem toe_def}s and when evaluating @{docitem pp_def}s and @{docitem st_def}s.\<close>
of @{docitem (unchecked) toeDef}s and when evaluating @{docitem ppDef}s
and @{docitem (unchecked) stDef}s.\<close>
Definition*[toeDef] \<open>\<close>
Definition*[sfrs_def] \<open>\<close>
Definition*[sfr_def] \<open>\<close>
Definition*[stDef] \<open>\<close>
Definition*[sfp_def] \<open>\<close>
Definition*[tsf_def] \<open>\<close>
end


@ -10,16 +10,18 @@
* SPDX-License-Identifier: BSD-2-Clause
*************************************************************************)
section\<open>CC 3.1.R5\<close>
(*<*)
theory "CC_v3_1_R5"
imports "Isabelle_DOF.technical_report"
imports
"Isabelle_DOF.technical_report"
"CC_terminology"
begin
(*>*)
section \<open>General Infrastructure on CC Evaluations\<close>
subsection \<open>General Infrastructure on CC Evaluations\<close>
datatype EALs = EAL1 | EAL2 | EAL3 | EAL4 | EAL5 | EAL6 | EAL7
@ -30,7 +32,7 @@ doc_class CC_structure_element =(* text_element + *)
doc_class CC_text_element = text_element +
eval_level :: EALs
section \<open>Security target ontology\<close>
subsection \<open>Security target ontology\<close>
doc_class st_ref_cls = CC_text_element +
@ -53,12 +55,11 @@ doc_class toe_ovrw_cls = CC_text_element +
firmeware_req:: "CC_text_element list" <= "[]"
features_req :: "CC_text_element list" <= "[]"
invariant eal_consistency::
"\<lambda> X::toe_ovrw_cls .
(case eval_level X of
EAL1 \<Rightarrow> software_req X \<noteq> []
| EAL2 \<Rightarrow> software_req X \<noteq> []
| EAL3 \<Rightarrow> software_req X \<noteq> []
| EAL4 \<Rightarrow> software_req X \<noteq> []
"(case eval_level \<sigma> of
EAL1 \<Rightarrow> software_req \<sigma> \<noteq> []
| EAL2 \<Rightarrow> software_req \<sigma> \<noteq> []
| EAL3 \<Rightarrow> software_req \<sigma> \<noteq> []
| EAL4 \<Rightarrow> software_req \<sigma> \<noteq> []
| _ \<Rightarrow> undefined)"
thm eal_consistency_inv_def


@ -0,0 +1,57 @@
%% Copyright (C) University of Exeter
%% University of Paris-Saclay
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause
\NeedsTeXFormat{LaTeX2e}\relax
\ProvidesPackage{DOF-CC_terminology}
[00/00/0000 Document-Type Support Framework for Isabelle (CC).]
\RequirePackage{DOF-COL}
\usepackage{etex}
\ifdef{\reserveinserts}{\reserveinserts{28}}{}
\newkeycommand*{\mathcc}[label=,type=%
, scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTshortUNDERSCOREname ={}%
, scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTmcc = %
, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel =%
, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable =%
, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants =%
, scholarlyUNDERSCOREpaperDOTtextUNDERSCOREsectionDOTmainUNDERSCOREauthor =%
, scholarlyUNDERSCOREpaperDOTtextUNDERSCOREsectionDOTfixmeUNDERSCORElist =%
, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel =%
, scholarlyUNDERSCOREpaperDOTtechnicalDOTdefinitionUNDERSCORElist =%
, scholarlyUNDERSCOREpaperDOTtechnicalDOTstatus =%
, CCUNDERSCOREterminologyDOTconceptUNDERSCOREdefinitionDOTtag=%
, CCUNDERSCOREterminologyDOTconceptUNDERSCOREdefinitionDOTshortUNDERSCOREtag=%
]
[1]
{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTshortUNDERSCOREname}} {} }
{%
\begin{\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTmcc}}\label{\commandkey{label}}
#1
\end{\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTmcc}}
}{%
\begin{\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTmcc}}[\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTshortUNDERSCOREname}]\label{\commandkey{label}}
#1
\end{\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTmcc}}
}
\end{isamarkuptext}%
}
\expandafter\def\csname isaDofDOTtextDOTscholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent\endcsname{\mathcc}


@ -24,10 +24,14 @@ identifies:
\<close>
(*<<*)
theory CENELEC_50128
imports "Isabelle_DOF.technical_report"
theory
CENELEC_50128
imports
"Isabelle_DOF.technical_report"
begin
define_ontology "DOF-CENELEC_50128.sty" "CENELEC 50128"
(* this is a hack and should go into its own ontology, providing things like:
- Assumption*
- Hypothesis*
@ -155,10 +159,10 @@ which have the required safety integrity level.\<close>
Definition*[entity]
\<open>person, group or organisation who fulfils a role as defined in this European Standard.\<close>
declare_reference*[fault]
declare_reference*[fault::cenelec_term]
Definition*[error]
\<open>defect, mistake or inaccuracy which could result in failure or in a deviation
from the intended performance or behaviour (cf. @{cenelec_term (unchecked) \<open>fault\<close>})).\<close>
from the intended performance or behaviour (cf. @{cenelec_term (unchecked) \<open>fault\<close>}).\<close>
Definition*[fault]
\<open>defect, mistake or inaccuracy which could result in failure or in a deviation
@ -521,9 +525,11 @@ text\<open>Figure 3 in Chapter 5: Illustrative Development Lifecycle 1\<close>
text\<open>Global Overview\<close>
(*
figure*[fig3::figure, relative_width="100",
src="''examples/CENELEC_50128/mini_odo/document/figures/CENELEC-Fig.3-docStructure.png''"]
\<open>Illustrative Development Lifecycle 1\<close>
*)
text\<open>Actually, the Figure 4 in Chapter 5: Illustrative Development Lifecycle 2 is more faithful
to the remaining document: Software Architecture and Design phases are merged, as in 7.3.\<close>
@ -614,9 +620,10 @@ doc_class cenelec_report = text_element +
invariant must_be_chapter :: "text_element.level \<sigma> = Some(0)"
invariant three_eyes_prcpl:: " written_by \<sigma> \<noteq> fst_check \<sigma>
\<and> written_by \<sigma> \<noteq> snd_check \<sigma>"
(*
text\<open>see \<^figure>\<open>fig3\<close> and Fig 4 in Chapter 5: Illustrative Development Lifecycle 2\<close>
*)
doc_class external_specification =
phase :: "phase" <= "SYSDEV_ext"
@ -1007,14 +1014,16 @@ ML\<open>
fun check_sil oid _ ctxt =
let
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
val monitor_record_value = #value (the (DOF_core.get_object_local oid ctxt'))
val DOF_core.Instance {value = monitor_record_value, ...} =
DOF_core.get_instance_global oid (Context.theory_of ctxt)
val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
fun check_sil'' [] = true
| check_sil'' (x::xs) =
let
val (_, doc_oid) = x
val doc_record_value = #value (the (DOF_core.get_object_local doc_oid ctxt'))
val DOF_core.Instance {value = doc_record_value, ...} =
DOF_core.get_instance_global doc_oid (Context.theory_of ctxt)
val Const _ $ _ $ _ $ _ $ _ $ cenelec_document_ext = doc_record_value
val Const _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = cenelec_document_ext
in
@@ -1026,27 +1035,37 @@ fun check_sil oid _ ctxt =
in check_sil'' traces end
\<close>
setup\<open>DOF_core.update_class_invariant "CENELEC_50128.monitor_SIL0" check_sil\<close>
setup\<open>
(fn thy =>
let val ctxt = Proof_Context.init_global thy
val cid = "monitor_SIL0"
val binding = DOF_core.binding_from_onto_class_pos cid thy
val cid_long = DOF_core.get_onto_class_name_global cid thy
in DOF_core.add_ml_invariant binding (DOF_core.make_ml_invariant (check_sil, cid_long)) thy end)
\<close>
text\<open>
A more generic example of check_sil which can be generalized:
A more generic example of check\_sil which can be generalized further:
it is decoupled from the current CENELEC implementation
but is computationally much less efficient, as it relies on Isabelle's evaluation mechanism.\<close>
ML\<open>
fun check_sil_slow oid _ ctxt =
let
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
val monitor_record_value = #value (the (DOF_core.get_object_local oid ctxt'))
val monitor_cid = #cid (the (DOF_core.get_object_local oid ctxt'))
val DOF_core.Instance {value = monitor_record_value, ...} =
DOF_core.get_instance_global oid (Context.theory_of ctxt)
val DOF_core.Instance {cid = monitor_cid, ...} =
DOF_core.get_instance_global oid (Context.theory_of ctxt)
val monitor_sil_typ = (Syntax.read_typ ctxt' monitor_cid) --> @{typ "sil"}
val monitor_sil = Value_Command.value ctxt'
(Const("CENELEC_50128.monitor_SIL.sil", monitor_sil_typ) $ monitor_record_value)
val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
fun check_sil' [] = true
| check_sil' (x::xs) =
let
val (doc_cid, doc_oid) = x
val doc_record_value = #value (the (DOF_core.get_object_local doc_oid ctxt'))
val DOF_core.Instance {value = doc_record_value, ...} =
DOF_core.get_instance_global doc_oid (Context.theory_of ctxt)
val doc_sil_typ = (Syntax.read_typ ctxt' doc_cid) --> @{typ "sil"}
val doc_sil = Value_Command.value ctxt'
(Const ("CENELEC_50128.cenelec_document.sil", doc_sil_typ) $ doc_record_value)
@@ -1059,20 +1078,25 @@ fun check_sil_slow oid _ ctxt =
in check_sil' traces end
\<close>
(*setup\<open>DOF_core.update_class_invariant "CENELEC_50128.monitor_SIL0" check_sil_slow\<close>*)
(*setup\<open>
(fn thy =>
let val ctxt = Proof_Context.init_global thy
val binding = DOF_core.binding_from_onto_class_pos "monitor_SIL0" thy
in DOF_core.add_ml_invariant binding check_sil_slow thy end)
\<close>*)
(* As traces of monitor instances (docitems) are updated each time an instance is declared
(with text*, section*, etc.), invariants checking functions which use traces must
be declared as lazy invariants, to be checked only when closing a monitor, i.e.,
after the monitor traces are populated.
(with text*, section*, etc.), invariant-checking functions that check the full list of traces
must be declared as lazy invariants, to be checked only when closing a monitor, i.e.,
after all the monitor traces are populated.
*)
ML\<open>
fun check_required_documents oid _ ctxt =
let
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
val {monitor_tab,...} = DOF_core.get_data ctxt'
val {accepted_cids, ...} = the (Symtab.lookup monitor_tab oid)
val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
val DOF_core.Monitor_Info {accepted_cids, ...} =
DOF_core.get_monitor_info_global oid (Context.theory_of ctxt)
val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
fun check_required_documents' [] = true
| check_required_documents' (cid::cids) =
if exists (fn (doc_cid, _) => equal cid doc_cid) traces
@@ -1080,7 +1104,8 @@ fun check_required_documents oid _ ctxt =
else
let
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
val monitor_record_value = #value (the (DOF_core.get_object_local oid ctxt'))
val DOF_core.Instance {value = monitor_record_value, ...} =
DOF_core.get_instance_global oid (Context.theory_of ctxt)
val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
in error ("A " ^ cid ^ " cenelec document is required with "
^ Syntax.string_of_term ctxt' monitor_sil)
@@ -1088,7 +1113,15 @@ fun check_required_documents oid _ ctxt =
in check_required_documents' accepted_cids end
\<close>
setup\<open>DOF_core.update_class_lazy_invariant "CENELEC_50128.monitor_SIL0" check_required_documents\<close>
setup\<open>
fn thy =>
let val ctxt = Proof_Context.init_global thy
val cid = "monitor_SIL0"
val binding = DOF_core.binding_from_onto_class_pos cid thy
val cid_long = DOF_core.get_onto_class_name_global cid thy
in DOF_core.add_closing_ml_invariant binding
(DOF_core.make_ml_invariant (check_required_documents, cid_long)) thy end
\<close>
(* Test pattern matching for the records of the current CENELEC implementation classes,
and used by checking functions.
@@ -1099,11 +1132,11 @@ text*[MonitorPatternMatchingTest::monitor_SIL0]\<open>\<close>
text*[CenelecClassPatternMatchingTest::SQAP, sil = "SIL0"]\<open>\<close>
ML\<open>
val thy = @{theory}
val monitor_record_value =
#value (the (DOF_core.get_object_global "MonitorPatternMatchingTest" thy))
val DOF_core.Instance {value = monitor_record_value, ...} =
DOF_core.get_instance_global "MonitorPatternMatchingTest" thy
val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
val doc_record_value = #value (the (DOF_core.get_object_global
"CenelecClassPatternMatchingTest" thy))
val DOF_core.Instance {value = doc_record_value, ...} =
DOF_core.get_instance_global "CenelecClassPatternMatchingTest" thy
val Const _ $ _ $ _ $ _ $ _ $ cenelec_document_ext = doc_record_value
val Const _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = cenelec_document_ext
\<close>
@@ -1236,15 +1269,16 @@ doc_class test_documentation = (* OUTDATED ? *)
section\<open>Global Documentation Structure\<close>
(*<<*)
doc_class global_documentation_structure = text_element +
level :: "int option" <= "Some(-1::int)" \<comment> \<open>document must be a chapter\<close>
accepts "SYSREQS ~~ \<comment> \<open>system_requirements_specification\<close>
SYSSREQS ~~ \<comment> \<open>system_safety_requirements_specification\<close>
SYSAD ~~ \<comment> \<open>system_architecture description\<close>
accepts "SYSREQS ~~ \<comment> \<open>system requirements specification\<close>
SYSSREQS ~~ \<comment> \<open>system safety requirements specification\<close>
SYSAD ~~ \<comment> \<open>system architecture description\<close>
SYSS_pl ~~ \<comment> \<open>system safety plan\<close>
(SWRS || OSWTS) " \<comment> \<open>software requirements specification OR
overall software test specification\<close>
(*>>*)
(* MORE TO COME : *)
section\<open> META : Testing and Validation \<close>
@@ -1252,10 +1286,10 @@ section\<open> META : Testing and Validation \<close>
text\<open>Test : @{semi_formal_content \<open>COTS\<close>}\<close>
ML
\<open> DOF_core.read_cid_global @{theory} "requirement";
DOF_core.read_cid_global @{theory} "SRAC";
DOF_core.is_defined_cid_global "SRAC" @{theory};
DOF_core.is_defined_cid_global "EC" @{theory}; \<close>
\<open> DOF_core.get_onto_class_name_global "requirement" @{theory};
DOF_core.get_onto_class_name_global "SRAC" @{theory};
DOF_core.get_onto_class_global "SRAC" @{theory};
DOF_core.get_onto_class_global "EC" @{theory}; \<close>
ML
\<open> DOF_core.is_subclass @{context} "CENELEC_50128.EC" "CENELEC_50128.EC";
@@ -1264,18 +1298,19 @@ ML
DOF_core.is_subclass @{context} "CENELEC_50128.EC" "CENELEC_50128.test_requirement"; \<close>
ML
\<open> val {docobj_tab={maxano, tab=ref_tab},docclass_tab=class_tab,...} = DOF_core.get_data @{context};
Symtab.dest ref_tab;
Symtab.dest class_tab; \<close>
\<open> val ref_tab = DOF_core.get_instances \<^context>
val docclass_tab = DOF_core.get_onto_classes @{context};
Name_Space.dest_table ref_tab;
Name_Space.dest_table docclass_tab; \<close>
ML
\<open> val internal_data_of_SRAC_definition = DOF_core.get_attributes_local "SRAC" @{context} \<close>
ML
\<open> DOF_core.read_cid_global @{theory} "requirement";
\<open> DOF_core.get_onto_class_name_global "requirement" @{theory};
Syntax.parse_typ @{context} "requirement";
val Type(t,_) = Syntax.parse_typ @{context} "requirement" handle ERROR _ => dummyT;
Syntax.read_typ @{context} "hypothesis" handle _ => dummyT;
Proof_Context.init_global; \<close>
end
end


@@ -0,0 +1,397 @@
(*************************************************************************
* Copyright (C)
* 2019-2023 The University of Exeter
* 2018-2023 The University of Paris-Saclay
* 2018 The University of Sheffield
*
* License:
* This program can be redistributed and/or modified under the terms
* of the 2-clause BSD-style license.
*
* SPDX-License-Identifier: BSD-2-Clause
*************************************************************************)
(*<<*)
theory
CENELEC_50128_Documentation
imports
CENELEC_50128
begin
define_shortcut* dof \<rightleftharpoons> \<open>\dof\<close>
isadof \<rightleftharpoons> \<open>\isadof{}\<close>
define_shortcut* TeXLive \<rightleftharpoons> \<open>\TeXLive\<close>
BibTeX \<rightleftharpoons> \<open>\BibTeX{}\<close>
LaTeX \<rightleftharpoons> \<open>\LaTeX{}\<close>
TeX \<rightleftharpoons> \<open>\TeX{}\<close>
pdf \<rightleftharpoons> \<open>PDF\<close>
ML\<open>
fun boxed_text_antiquotation name (* redefined in these more abstract terms *) =
DOF_lib.gen_text_antiquotation name DOF_lib.report_text
(fn ctxt => DOF_lib.string_2_text_antiquotation ctxt
#> DOF_lib.enclose_env false ctxt "isarbox")
val neant = K(Latex.text("",\<^here>))
fun boxed_theory_text_antiquotation name (* redefined in these more abstract terms *) =
DOF_lib.gen_text_antiquotation name DOF_lib.report_theory_text
(fn ctxt => DOF_lib.string_2_theory_text_antiquotation ctxt
#> DOF_lib.enclose_env false ctxt "isarbox"
(* #> neant *)) (*debugging *)
fun boxed_sml_text_antiquotation name =
DOF_lib.gen_text_antiquotation name (K(K()))
(fn ctxt => Input.source_content
#> Latex.text
#> DOF_lib.enclose_env true ctxt "sml")
(* the simplest conversion possible *)
fun boxed_pdf_antiquotation name =
DOF_lib.gen_text_antiquotation name (K(K()))
(fn ctxt => Input.source_content
#> Latex.text
#> DOF_lib.enclose_env true ctxt "out")
(* the simplest conversion possible *)
fun boxed_latex_antiquotation name =
DOF_lib.gen_text_antiquotation name (K(K()))
(fn ctxt => Input.source_content
#> Latex.text
#> DOF_lib.enclose_env true ctxt "ltx")
(* the simplest conversion possible *)
fun boxed_bash_antiquotation name =
DOF_lib.gen_text_antiquotation name (K(K()))
(fn ctxt => Input.source_content
#> Latex.text
#> DOF_lib.enclose_env true ctxt "bash")
(* the simplest conversion possible *)
\<close>
setup\<open>(* std_text_antiquotation \<^binding>\<open>my_text\<close> #> *)
boxed_text_antiquotation \<^binding>\<open>boxed_text\<close> #>
(* std_text_antiquotation \<^binding>\<open>my_cartouche\<close> #> *)
boxed_text_antiquotation \<^binding>\<open>boxed_cartouche\<close> #>
(* std_theory_text_antiquotation \<^binding>\<open>my_theory_text\<close>#> *)
boxed_theory_text_antiquotation \<^binding>\<open>boxed_theory_text\<close> #>
boxed_sml_text_antiquotation \<^binding>\<open>boxed_sml\<close> #>
boxed_pdf_antiquotation \<^binding>\<open>boxed_pdf\<close> #>
boxed_latex_antiquotation \<^binding>\<open>boxed_latex\<close>#>
boxed_bash_antiquotation \<^binding>\<open>boxed_bash\<close>
\<close>
(*>>*)
section*[cenelec_onto::example]\<open>Writing Certification Documents \<^boxed_theory_text>\<open>CENELEC_50128\<close>\<close>
subsection\<open>The CENELEC 50128 Example\<close>
text\<open>
The ontology \<^verbatim>\<open>CENELEC_50128\<close>\index{ontology!CENELEC\_50128} is a small ontology modeling
documents for a certification following CENELEC 50128~@{cite "boulanger:cenelec-50128:2015"}.
The \<^isadof> distribution contains a small example using the ontology ``CENELEC\_50128'' in
the directory \nolinkurl{examples/CENELEC_50128/mini_odo/}. You can inspect/edit the
integrated source example by either
\<^item> starting Isabelle/jEdit using your graphical user interface (\<^eg>, by clicking on the
Isabelle-Icon provided by the Isabelle installation) and loading the file
\nolinkurl{examples/CENELEC_50128/mini_odo/mini_odo.thy}.
\<^item> starting Isabelle/jEdit from the command line by calling:
@{boxed_bash [display]\<open>ë\prompt{\isadofdirn}ë
isabelle jedit examples/CENELEC_50128/mini_odo/mini_odo.thy \<close>}
\<close>
text\<open>\<^noindent> Finally, you
\<^item> can build the \<^pdf>-document by calling:
@{boxed_bash [display]\<open>ë\prompt{\isadofdirn}ë isabelle build mini_odo \<close>}
\<close>
subsection\<open>Modeling CENELEC 50128\<close>
text\<open>
Documents to be provided in formal certifications (such as CENELEC
50128~@{cite "boulanger:cenelec-50128:2015"} or Common Criteria~@{cite "cc:cc-part3:2006"}) can
benefit greatly from the control of ontological consistency: a substantial amount of the work
of evaluators in formal certification processes consists in tracing down the links from
requirements over assumptions down to elements of evidence, be it in form of semi-formal
documentation, models, code, or tests. In a certification process, traceability becomes a major
concern; providing mechanisms that ensure complete traceability already during the development of
the integrated source can, in our view, increase the speed and reduce the risk of certification
processes. Making the link-structure machine-checkable, be it between requirements, assumptions,
their implementation and their discharge by evidence (be it tests, proofs, or authoritative
arguments), has the potential to decrease the cost of software developments
targeting certifications.
As in many other cases, formal certification documents come with their own terminology and pragmatics
of what has to be demonstrated and where, and how the traceability of requirements through
design-models over code to system environment assumptions has to be assured.
In the sequel, we present a simplified version of an ontological model used in a
case-study~@{cite "bezzecchi.ea:making:2018"}. We start with an introduction of the concept of
requirement:
@{boxed_theory_text [display]\<open>
doc_class requirement = long_name :: "string option"
doc_class hypothesis = requirement +
hyp_type :: hyp_type <= physical (* default *)
datatype ass_kind = informal | semiformal | formal
doc_class assumption = requirement +
assumption_kind :: ass_kind <= informal
\<close>}
Such ontologies can be enriched by larger explanations and examples, which may help
the team of engineers substantially when developing the central document for a certification,
like an explanation of the precise difference between a \<^typ>\<open>hypothesis\<close> and an
\<^typ>\<open>assumption\<close> in the context of the evaluation standard. Since the PIDE makes the
definition of each document class available by a simple mouse-click, this kind of meta-knowledge
can be made far more accessible during the document evolution.
For example, the term of category \<^typ>\<open>assumption\<close> is used for domain-specific assumptions.
It has \<^const>\<open>formal\<close>, \<^const>\<open>semiformal\<close> and \<^const>\<open>informal\<close> sub-categories. They have to be
tracked and discharged by appropriate validation procedures within a
certification process, be it by test or proof. It is different from a \<^typ>\<open>hypothesis\<close>, which is
globally assumed and accepted.
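For instance, a (purely hypothetical, illustrative) formal assumption could be stated by
overriding the default kind:
@{boxed_theory_text [display]\<open>
text*[a1::assumption, assumption_kind = "formal"]\<open>
The deployment platform provides a hardware watchdog.
\<close>
\<close>}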
In the sequel, the category \<^typ>\<open>exported_constraint\<close> (or \<^typ>\<open>EC\<close> for short)
is used for formal assumptions that arise during the analysis,
design, or implementation and have to be tracked until the final
evaluation target, and discharged by appropriate validation procedures
within the certification process, be it by test or proof. A particular class of interest
is the category \<^typ>\<open>safety_related_application_condition\<close> (or \<^typ>\<open>SRAC\<close>
for short) which is used for \<^typ>\<open>EC\<close>'s that establish safety properties
of the evaluation target. Their traceability throughout the certification
is therefore particularly critical. This is naturally modeled as follows:
@{boxed_theory_text [display]\<open>
doc_class EC = assumption +
assumption_kind :: ass_kind <= (*default *) formal
doc_class SRAC = EC +
assumption_kind :: ass_kind <= (*default *) formal
\<close>}
We now can, \<^eg>, write
@{boxed_theory_text [display]\<open>
text*[ass123::SRAC]\<open>
The overall sampling frequency of the odometer subsystem is therefore
14 kHz, which includes sampling, computing and result communication
times \ldots
\<close>
\<close>}
This will be shown in the \<^pdf> as follows:
\<close>
text*[ass123::SRAC] \<open> The overall sampling frequency of the odometer
subsystem is therefore 14 kHz, which includes sampling, computing and
result communication times \ldots \<close>
text\<open>Note that this \<^pdf>-output is the result of a specific setup for \<^typ>\<open>SRAC\<close>s.\<close>
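text\<open>
In essence, this layout is realized in the accompanying style file by a
\<^boxed_latex>\<open>tcolorbox\<close>-based theorem environment (excerpt, slightly simplified):
@{boxed_latex [display]
\<open>\newtheorem{SRAC}{SRAC}
\tcolorboxenvironment{SRAC}{
  boxrule=0pt, colback={white!90!SRAC},
  borderline west={2pt}{0pt}{SRAC}, breakable}\<close>}
\<close>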
subsection*[ontopide::technical]\<open>Editing Support for CENELEC 50128\<close>
figure*[figfig3::figure,relative_width="95",file_src="''figures/antiquotations-PIDE.png''"]
\<open> Standard antiquotations referring to theory elements.\<close>
text\<open> The corresponding view in @{docitem \<open>figfig3\<close>} shows a core part of a document
conforming to the \<^verbatim>\<open>CENELEC_50128\<close> ontology. The first sample shows standard Isabelle antiquotations
@{cite "wenzel:isabelle-isar:2020"} referring to formal entities of a theory. This way, the informal parts
of a document get ``formal content'' and become more robust under change.\<close>
figure*[figfig5::figure, relative_width="95", file_src="''figures/srac-definition.png''"]
\<open> Defining a \<^typ>\<open>SRAC\<close> in the integrated source ... \<close>
figure*[figfig7::figure, relative_width="95", file_src="''figures/srac-as-es-application.png''"]
\<open> Using a \<^typ>\<open>SRAC\<close> as \<^typ>\<open>EC\<close> document element. \<close>
text\<open> The subsequent sample in @{figure \<open>figfig5\<close>} shows the definition of a
\<^emph>\<open>safety-related application condition\<close>, a side-condition of a theorem which
has the consequence that a certain calculation must be executed sufficiently fast on an embedded
device. This condition cannot be established inside the formal theory but has to be
checked by system integration tests. Now we reference in @{figure \<open>figfig7\<close>} this
safety-related condition; however, this happens in a context where general \<^emph>\<open>exported constraints\<close>
are listed. \<^isadof> checks and establishes that this is legal in the given ontology.
\<close>
text\<open>
\<^item> \<^theory_text>\<open>@{term_ \<open>term\<close> }\<close> parses and type-checks \<open>term\<close> with term antiquotations,
for instance, \<^theory_text>\<open>@{term_ \<open>@{cenelec_term \<open>FT\<close>}\<close>}\<close> will parse and check
that \<open>FT\<close> is indeed an instance of the class \<^typ>\<open>cenelec_term\<close>.
\<close>
subsection\<open>A Domain-Specific Ontology: \<^verbatim>\<open>CENELEC_50128\<close>\<close>
(*<*)
ML\<open>val toLaTeX = String.translate (fn c => if c = #"_" then "\\_" else String.implode[c])\<close>
ML\<open>writeln (DOF_core.print_doc_class_tree
@{context} (fn (n,l) => true (* String.isPrefix "technical_report" l
orelse String.isPrefix "Isa_COL" l *))
toLaTeX)\<close>
(*>*)
text\<open> The \<^verbatim>\<open>CENELEC_50128\<close> ontology in \<^theory>\<open>Isabelle_DOF-Ontologies.CENELEC_50128\<close>
is an example of a domain-specific ontology.
It is based on \<^verbatim>\<open>technical_report\<close>, since we assume that this kind of format will be most
appropriate for this type of long and tedious document:
%
\begin{center}
\begin{minipage}{.9\textwidth}\footnotesize
\dirtree{%
.0 .
.1 CENELEC\_50128.judgement\DTcomment{...}.
.1 CENELEC\_50128.test\_item\DTcomment{...}.
.2 CENELEC\_50128.test\_case\DTcomment{...}.
.2 CENELEC\_50128.test\_tool\DTcomment{...}.
.2 CENELEC\_50128.test\_result\DTcomment{...}.
.2 CENELEC\_50128.test\_adm\_role\DTcomment{...}.
.2 CENELEC\_50128.test\_environment\DTcomment{...}.
.2 CENELEC\_50128.test\_requirement\DTcomment{...}.
.2 CENELEC\_50128.test\_specification\DTcomment{...}.
.1 CENELEC\_50128.objectives\DTcomment{...}.
.1 CENELEC\_50128.design\_item\DTcomment{...}.
.2 CENELEC\_50128.interface\DTcomment{...}.
.1 CENELEC\_50128.sub\_requirement\DTcomment{...}.
.1 CENELEC\_50128.test\_documentation\DTcomment{...}.
.1 Isa\_COL.text\_element\DTcomment{...}.
.2 CENELEC\_50128.requirement\DTcomment{...}.
.3 CENELEC\_50128.TC\DTcomment{...}.
.3 CENELEC\_50128.FnI\DTcomment{...}.
.3 CENELEC\_50128.SIR\DTcomment{...}.
.3 CENELEC\_50128.CoAS\DTcomment{...}.
.3 CENELEC\_50128.HtbC\DTcomment{...}.
.3 CENELEC\_50128.SILA\DTcomment{...}.
.3 CENELEC\_50128.assumption\DTcomment{...}.
.4 CENELEC\_50128.AC\DTcomment{...}.
.5 CENELEC\_50128.EC\DTcomment{...}.
.6 CENELEC\_50128.SRAC\DTcomment{...}.
.3 CENELEC\_50128.hypothesis\DTcomment{...}.
.4 CENELEC\_50128.security\_hyp\DTcomment{...}.
.3 CENELEC\_50128.safety\_requirement\DTcomment{...}.
.2 CENELEC\_50128.cenelec\_text\DTcomment{...}.
.3 CENELEC\_50128.SWAS\DTcomment{...}.
.3 [...].
.2 scholarly\_paper.text\_section\DTcomment{...}.
.3 scholarly\_paper.technical\DTcomment{...}.
.4 scholarly\_paper.math\_content\DTcomment{...}.
.5 CENELEC\_50128.semi\_formal\_content\DTcomment{...}.
.1 ...
}
\end{minipage}
\end{center}
\<close>
(* TODO : Rearrange ontology hierarchies. *)
subsubsection\<open>Examples\<close>
text\<open>
The category ``exported constraint (EC)'' is defined in the file
\<^file>\<open>CENELEC_50128.thy\<close> as follows:
@{boxed_theory_text [display]\<open>
doc_class requirement = text_element +
long_name :: "string option"
is_concerned :: "role set"
doc_class assumption = requirement +
assumption_kind :: ass_kind <= informal
doc_class AC = assumption +
is_concerned :: "role set" <= "UNIV"
doc_class EC = AC +
assumption_kind :: ass_kind <= (*default *) formal
\<close>}
\<close>
text\<open>
We now define the document representations, in the file
\<^file>\<open>DOF-CENELEC_50128.sty\<close>. Let us assume that we want to
register the definition of EC's in a dedicated table of contents (\<^boxed_latex>\<open>toe\<close>)
and use an earlier defined environment \inlineltx|\begin{EC}...\end{EC}| for their graphical
representation. Note that the \inlineltx|\newisadof{}[]{}|-command requires the
full-qualified names, \<^eg>, \<^boxed_theory_text>\<open>text.CENELEC_50128.EC\<close> for the document class and
\<^boxed_theory_text>\<open>CENELEC_50128.requirement.long_name\<close> for the attribute \<^const>\<open>long_name\<close>,
inherited from the document class \<^typ>\<open>requirement\<close>. The representation of \<^typ>\<open>EC\<close>'s
can now be defined as follows:
% TODO:
% Explain the text qualifier of the long_name text.CENELEC_50128.EC
\begin{ltx}
\newisadof{text.CENELEC_50128.EC}%
[label=,type=%
,Isa_COL.text_element.level=%
,Isa_COL.text_element.referentiable=%
,Isa_COL.text_element.variants=%
,CENELEC_50128.requirement.is_concerned=%
,CENELEC_50128.requirement.long_name=%
,CENELEC_50128.EC.assumption_kind=][1]{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{CENELEC_50128.requirement.long_name}}{}}{%
% If long_name is not defined, we only create an entry in the table toe
% using the auto-generated number of the EC
\begin{EC}%
\addxcontentsline{toe}{chapter}[]{\autoref{\commandkey{label}}}%
}{%
% If long_name is defined, we use the long_name as title in the
% layout of the EC, in the table "toe", and as index entry.
\begin{EC}[\commandkey{CENELEC_50128.requirement.long_name}]%
\addxcontentsline{toe}{chapter}[]{\autoref{\commandkey{label}}: %
\commandkey{CENELEC_50128.requirement.long_name}}%
\DOFindex{EC}{\commandkey{CENELEC_50128.requirement.long_name}}%
}%
\label{\commandkey{label}}% we use the label attribute as anchor
#1% The main text of the EC
\end{EC}
\end{isamarkuptext}%
}
\end{ltx}
\<close>
text\<open>
For example, the @{docitem "ass123"} is mapped to
@{boxed_latex [display]
\<open>\begin{isamarkuptext*}%
[label = {ass123},type = {CENELEC_50128.SRAC},
args={label = {ass123}, type = {CENELEC_50128.SRAC},
CENELEC_50128.EC.assumption_kind = {formal}}
] The overall sampling frequency of the odometer subsystem is therefore
14 kHz, which includes sampling, computing and result communication
times ...
\end{isamarkuptext*}\<close>}
This environment is mapped to a plain \<^LaTeX> command via:
@{boxed_latex [display]
\<open> \NewEnviron{isamarkuptext*}[1][]{\isaDof[env={text},#1]{\BODY}} \<close>}
\<close>
text\<open>
For the command-based setup, \<^isadof> provides a dispatcher that selects the most specific
implementation for a given \<^boxed_theory_text>\<open>doc_class\<close>:
@{boxed_latex [display]
\<open>%% The Isabelle/DOF dispatcher:
\newkeycommand+[\|]\isaDof[env={UNKNOWN},label=,type={dummyT},args={}][1]{%
\ifcsname isaDof.\commandkey{type}\endcsname%
\csname isaDof.\commandkey{type}\endcsname%
[label=\commandkey{label},\commandkey{args}]{#1}%
\else\relax\fi%
\ifcsname isaDof.\commandkey{env}.\commandkey{type}\endcsname%
\csname isaDof.\commandkey{env}.\commandkey{type}\endcsname%
[label=\commandkey{label},\commandkey{args}]{#1}%
\else%
\message{Isabelle/DOF: Using default LaTeX representation for concept %
"\commandkey{env}.\commandkey{type}".}%
\ifcsname isaDof.\commandkey{env}\endcsname%
\csname isaDof.\commandkey{env}\endcsname%
[label=\commandkey{label}]{#1}%
\else%
\errmessage{Isabelle/DOF: No LaTeX representation for concept %
"\commandkey{env}.\commandkey{type}" defined and no default %
definition for "\commandkey{env}" available either.}%
\fi%
\fi%
}\<close>}
\<close>
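text\<open>
For illustration only, a hypothetical invocation of the dispatcher for an \<^typ>\<open>EC\<close>
could look as follows; the dispatcher first tries a macro specific to the type, then one for
the env/type pair, and finally falls back to a default for the env:
@{boxed_latex [display]
\<open>\isaDof[env={text},type={CENELEC_50128.EC},label={ec0},
       args={CENELEC_50128.EC.assumption_kind={formal}}]{The constraint text ...}\<close>}
\<close>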
(*<<*)
end
(*>>*)


@@ -0,0 +1,220 @@
%% Copyright (C) 2019 University of Exeter
%% 2018 University of Paris-Saclay
%% 2018 The University of Sheffield
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause
\NeedsTeXFormat{LaTeX2e}\relax
\ProvidesPackage{DOF-cenelec_50128}
[00/00/0000 Document-Type Support Framework for Isabelle (CENELEC 50128).]
\RequirePackage{DOF-COL}
\usepackage{etex}
\ifdef{\reserveinserts}{\reserveinserts{28}}{}
\usepackage[many]{tcolorbox}
\usepackage{marginnote}
% Index setup
\usepackage{index}
\makeindex
\AtEndDocument{\printindex}
\newcommand{\DOFindex}[2]{%
\marginnote{\normalfont\textbf{#1}: #2}%
\expandafter\index\expandafter{\expanded{#2 (#1)}}%
}%
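% Hypothetical usage example (for illustration only):
%   \DOFindex{EC}{Sampling Frequency}
% places a margin note "EC: Sampling Frequency" and adds the index
% entry "Sampling Frequency (EC)".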
%% SRAC
\providecolor{SRAC}{named}{green}
\ifcsdef{DeclareNewTOC}{%
\DeclareNewTOC[%
owner=\jobname,
type=SRAC,%
types=SRACs,%
listname={List of SRACs}%
]{tos}
\setuptoc{tos}{chapteratlist}
\AtEndEnvironment{frontmatter}{\listofSRACs}
}{}
\newtheorem{SRAC}{SRAC}
\tcolorboxenvironment{SRAC}{
boxrule=0pt
,boxsep=0pt
,colback={white!90!SRAC}
,enhanced jigsaw
,borderline west={2pt}{0pt}{SRAC}
,sharp corners
,before skip=10pt
,after skip=10pt
,breakable
}
\newcommand{\SRACautorefname}{SRAC}
\newisadof{textDOTCENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRAC}%
[label=,type=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTisUNDERSCOREconcerned=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTformalUNDERSCORErepr=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTassumptionUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTECDOTassumptionUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTassumptionDOTassumptionUNDERSCOREkind=%
][1]{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}{}}{%
\begin{SRAC}%
\addxcontentsline{tos}{chapter}[]{\autoref{\commandkey{label}}}%
}{%
\begin{SRAC}[\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}]%
\addxcontentsline{tos}{chapter}[]{\autoref{\commandkey{label}}: \commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
\DOFindex{SRAC}{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
}\label{\commandkey{label}}%
#1%
\end{SRAC}
\end{isamarkuptext}%
}
% EC
\providecolor{EC}{named}{blue}
\ifcsdef{DeclareNewTOC}{%
\DeclareNewTOC[%
owner=\jobname,
type=EC,%
types=ECs,%
listname={List of ECs}%
]{toe}
\setuptoc{toe}{chapteratlist}
\AtEndEnvironment{frontmatter}{\listofECs}
}{}
\newtheorem{EC}{EC}
\tcolorboxenvironment{EC}{
boxrule=0pt
,boxsep=0pt
,colback={white!90!EC}
,enhanced jigsaw
,borderline west={2pt}{0pt}{EC}
,sharp corners
,before skip=10pt
,after skip=10pt
,breakable
}
\newcommand{\ECautorefname}{EC}
\newisadof{textDOTCENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTEC}%
[label=,type=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTisUNDERSCOREconcerned=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTformalUNDERSCORErepr=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTassumptionUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTECDOTassumptionUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTassumptionDOTassumptionUNDERSCOREkind=%
][1]{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}{}}{%
\begin{EC}%
\addxcontentsline{toe}{chapter}[]{\autoref{\commandkey{label}}}%
}{%
\begin{EC}[\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}]%
\addxcontentsline{toe}{chapter}[]{\autoref{\commandkey{label}}: \commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
\DOFindex{EC}{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
}\label{\commandkey{label}}%
#1%
\end{EC}
\end{isamarkuptext}%
}
% assumptions
\providecolor{assumption}{named}{orange}
\newtheorem{assumption}{assumption}
\tcolorboxenvironment{assumption}{
boxrule=0pt
,boxsep=0pt
,colback={white!90!assumption}
,enhanced jigsaw
,borderline west={2pt}{0pt}{assumption}
,sharp corners
,before skip=10pt
,after skip=10pt
,breakable
}
\newcommand{\assumptionautorefname}{assumption}
\newisadof{textDOTCENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTassumption}%
[label=,type=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTisUNDERSCOREconcerned=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTformalUNDERSCORErepr=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTassumptionUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTassumptionDOTassumptionUNDERSCOREkind=%
][1]{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}{}}{%
\begin{assumption}%
}{%
\begin{assumption}[\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}]%
\DOFindex{assumption}{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
}\label{\commandkey{label}}%
#1%
\end{assumption}
\end{isamarkuptext}%
}
% hypotheses
\providecolor{hypothesis}{named}{teal}
\newtheorem{hypothesis}{hypothesis}
\tcolorboxenvironment{hypothesis}{
,boxrule=0pt
,boxsep=0pt
,colback={white!90!hypothesis}
,enhanced jigsaw
,borderline west={2pt}{0pt}{hypothesis}
,sharp corners
,before skip=10pt
,after skip=10pt
,breakable
}
\newcommand{\hypothesisautorefname}{hypothesis}
\newisadof{textDOTCENELECUNDERSCOREFIVEZEROONETWOEIGHTDOThypothesis}%
[label=,type=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTisUNDERSCOREconcerned=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTformalUNDERSCORErepr=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOThypothesisUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOThypothesisDOThypUNDERSCOREtype=%
][1]{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}{}}{%
\begin{hypothesis}%
}{%
\begin{hypothesis}[\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}]%
\DOFindex{hypothesis}{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
}\label{\commandkey{label}}%
#1%
\end{hypothesis}
\end{isamarkuptext}%
}
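The `EC`, `assumption`, and `hypothesis` definitions above all instantiate the same four-step pattern: provide a colour, declare a `\newtheorem` environment, restyle it with `\tcolorboxenvironment`, and hook it into the ontology via `\newisadof`. A minimal sketch of the first three steps for a hypothetical `justification` kind (the name and colour are illustrative only, not part of the CENELEC_50128 ontology):

```latex
% Illustrative sketch only: "justification" is a hypothetical kind,
% not part of the CENELEC_50128 ontology.
\providecolor{justification}{named}{violet}
\newtheorem{justification}{justification}
\tcolorboxenvironment{justification}{
boxrule=0pt
,boxsep=0pt
,colback={white!90!justification}
,enhanced jigsaw
,borderline west={2pt}{0pt}{justification}
,sharp corners
,before skip=10pt
,after skip=10pt
,breakable
}
\newcommand{\justificationautorefname}{justification}
```

The `\newisadof` hook would then dispatch on the generated attribute keys, exactly as the three definitions above do.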


@@ -11,7 +11,7 @@
* SPDX-License-Identifier: BSD-2-Clause
*************************************************************************)
section\<open>A conceptual introduction into DOF and its features:\<close>
chapter\<open>A conceptual introduction into DOF and its features:\<close>
theory
Conceptual
@@ -20,23 +20,25 @@ imports
"Isabelle_DOF.Isa_COL"
begin
section\<open>Excursion: On the semantic consequences of this definition: \<close>
text\<open>Consider the following document class definition and its consequences:\<close>
doc_class A =
level :: "int option"
x :: int
subsection\<open>Excursion: On the semantic consequences of this definition: \<close>
text\<open>This class definition leads to an implicit Isabelle/HOL \<^theory_text>\<open>record\<close> definition
(cf. \<^url>\<open>https://isabelle.in.tum.de/dist/Isabelle2021/doc/isar-ref.pdf\<close>, chapter 11.6.).
(cf. \<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>, chapter 11.6.).
Consequently, \<^theory_text>\<open>doc_class\<close>'es inherit the entire theory-infrastructure from Isabelle records:
\<^enum> there is a HOL-type \<^typ>\<open>A\<close> and its extensible version \<^typ>\<open>'a A_scheme\<close>
\<^enum> there are HOL-terms representing \<^emph>\<open>doc_class instances\<close> with the high-level syntax:
\<^enum> there are HOL-terms representing \<^emph>\<open>doc\_class instances\<close> with the high-level syntax:
\<^enum> \<^term>\<open>undefined\<lparr>level := Some (1::int), x := 5::int \<rparr> :: A\<close>
(Note that this way to construct an instance is not necessarily computable
\<^enum> \<^term>\<open>\<lparr>tag_attribute = X, level = Y, x = Z\<rparr> :: A\<close>
\<^enum> \<^term>\<open>\<lparr>tag_attribute = X, level = Y, x = Z, \<dots> = M\<rparr> :: ('a A_scheme)\<close>
\<^enum> there is an entire proof infra-structure allowing to reason about \<^emph>\<open>doc_class instances\<close>;
\<^enum> there is an entire proof infra-structure allowing to reason about \<^emph>\<open>doc\_class instances\<close>;
this involves the constructor, the selectors (representing the \<^emph>\<open>attributes\<close> in OO lingo)
the update functions, the rules to establish equality and, if possible the code generator
setups:
@@ -49,23 +51,61 @@ Consequently, \<^theory_text>\<open>doc_class\<close>'es inherit the entire theo
\<^enum> @{thm [display] A.simps(6)}
\<^enum> ...
\<close>
(* the generated theory of the \<^theory_text>\<open>doc_class\<close> A can be inspected, of course, by *)
text\<open>The generated theory of the \<^theory_text>\<open>doc_class\<close> A can be inspected, of course, by:\<close>
find_theorems (60) name:Conceptual name:A
text\<open>A more abstract view on the state of the DOF machine can be found here:\<close>
print_doc_classes
print_doc_items
text\<open>... and an ML-level output:\<close>
ML\<open>
val docitem_tab = DOF_core.get_instances \<^context>;
val isa_transformer_tab = DOF_core.get_isa_transformers \<^context>;
val docclass_tab = DOF_core.get_onto_classes \<^context>;
\<close>
ML\<open>
Name_Space.dest_table docitem_tab;
Name_Space.dest_table isa_transformer_tab;
Name_Space.dest_table docclass_tab;
\<close>
text\<open>... or as ML assertion: \<close>
ML\<open>
@{assert} (Name_Space.dest_table docitem_tab = []);
fun match ("Conceptual.A", (* the long-name *)
DOF_core.Onto_Class {params, name, virtual,inherits_from=NONE,
attribute_decl, rejectS=[],rex=[], invs=[]})
= (Binding.name_of name = "A")
| match _ = false;
@{assert} (exists match (Name_Space.dest_table docclass_tab))
\<close>
text\<open>As a consequence of the theory of the \<^theory_text>\<open>doc_class\<close> \<open>A\<close>, the code-generator setup lets us
evaluate statements such as: \<close>
value\<open> the(A.level (A.make 3 (Some 4) 5)) = 4\<close>
text\<open>And finally, as a consequence of the above semantic construction of \<^theory_text>\<open>doc_class\<close>'es, the internal
text\<open>And further, as a consequence of the above semantic construction of \<^theory_text>\<open>doc_class\<close>'es, the internal
\<open>\<lambda>\<close>-calculus representation of class instances looks as follows:\<close>
ML\<open>
val tt = @{term \<open>the(A.level (A.make 3 (Some 4) 5))\<close>}
@{term \<open>the(A.level (A.make 3 (Some 4) 5))\<close>};
fun match (Const("Option.option.the",_) $
(Const ("Conceptual.A.level",_) $
(Const ("Conceptual.A.make", _) $ u $ v $ w))) = true
|match _ = false;
@{assert} (match @{term \<open>the(A.level (A.make 3 (Some 4) 5))\<close>})
\<close>
text\<open>For the code-generation, we have the following access to values representing class instances:\<close>
text\<open>And finally, via the code-generation, we have the following programmable
access to values representing class instances:\<close>
ML\<open>
val A_make = @{code A.make};
val zero = @{code "0::int"};
@@ -75,9 +115,9 @@ val add = @{code "(+) :: int \<Rightarrow> int \<Rightarrow> int"};
A_make zero (SOME one) (add one one)
\<close>
section\<open>Building up a conceptual class hierarchy:\<close>
subsection\<open>An independent class-tree root: \<close>
text\<open>An independent class-tree root: \<close>
doc_class B =
level :: "int option"
@@ -89,9 +129,9 @@ text\<open>We may even use type-synonyms for class synonyms ...\<close>
type_synonym XX = B
subsection\<open>Examples of inheritance \<close>
section\<open>Examples of inheritance \<close>
doc_class C = XX +
doc_class C = B +
z :: "A option" <= None (* A LINK, i.e. an attribute that has a type
referring to a document class. Mathematical
relations over document items can be modeled. *)
@@ -125,7 +165,7 @@ doc_class F =
and br':: "r \<sigma> \<noteq> [] \<and> length(b' \<sigma>) \<ge> 3"
and cr :: "properties \<sigma> \<noteq> []"
text\<open>The effect of the invariant declaration is to provide internal definitions for validation
text\<open>The effect of the invariant declaration is to provide internal HOL definitions for validation
functions of this invariant. They can be referenced as follows:\<close>
thm br_inv_def
thm br'_inv_def
@@ -133,7 +173,7 @@ thm cr_inv_def
term "\<lparr>F.tag_attribute = 5, properties = [], r = [], u = undefined, s = [], b = {}, b' = []\<rparr>"
term "br' (\<lparr>F.tag_attribute = 5, properties = [], r = [], u = undefined, s = [], b = {}, b' = []\<rparr>) "
term "br'_inv (\<lparr>F.tag_attribute = 5, properties = [], r = [], u = undefined, s = [], b = {}, b' = []\<rparr>) "
text\<open>Now, we can use these definitions in order to generate code for these validation functions.
Note, however, that not everything that we can write in an invariant (basically: HOL) is executable,
@@ -141,7 +181,7 @@ or even compilable by the code generator setup:\<close>
ML\<open> val cr_inv_code = @{code "cr_inv"} \<close> \<comment> \<open>works albeit thm is abstract ...\<close>
text\<open>while in :\<close>
(* ML\<open> val br_inv_code = @{code "br_inv"} \<close> \<comment>\<open>this does not work ...\<close> *)
ML\<open> val br_inv_code = @{code "br_inv"} \<close> \<comment>\<open>this does not work ...\<close>
text\<open>... the compilation fails due to the fact that nothing prevents the user
from defining an infinite relation between \<^typ>\<open>A\<close> and \<^typ>\<open>C\<close>. However, the alternative
@@ -151,30 +191,31 @@ ML\<open> val br'_inv_code = @{code "br'_inv"} \<close> \<comment> \<open>does w
text\<open>... is compilable ...\<close>
doc_class G = C +
g :: "thm" <= "@{thm \<open>HOL.refl\<close>}"
g :: "thm" <= "@{thm \<open>HOL.refl\<close>}" (* warning overriding attribute expected*)
doc_class M =
ok :: "unit"
accepts "A ~~ \<lbrace>C || D\<rbrace>\<^sup>* ~~ \<lbrakk>F\<rbrakk>"
text\<open>The final class and item tables look like this:\<close>
print_doc_classes
print_doc_items
(*
ML\<open> Document.state();\<close>
ML\<open> Session.get_keywords(); (* this looks to be really session global. *)
Outer_Syntax.command; \<close>
ML\<open> Thy_Header.get_keywords @{theory};(* this looks to be really theory global. *) \<close>
*)
ML\<open>
map fst (Name_Space.dest_table (DOF_core.get_onto_classes \<^context>));
open_monitor*[aaa::M]
section*[test::A]\<open>Test and Validation\<close>
term\<open>Conceptual.M.make\<close>
text\<open>Defining some document elements to be referenced in later on in another theory: \<close>
text*[sdf]\<open> Lorem ipsum @{thm refl}\<close>
text*[ sdfg :: F] \<open> Lorem ipsum @{thm refl}\<close>
text*[ xxxy ] \<open> Lorem ipsum @{F \<open>sdfg\<close>} rate @{thm refl}\<close>
close_monitor*[aaa]
let val class_ids_so_far = ["Conceptual.A", "Conceptual.B", "Conceptual.C", "Conceptual.D",
"Conceptual.E", "Conceptual.F", "Conceptual.G", "Conceptual.M",
"Isa_COL.float", "Isa_COL.frame", "Isa_COL.figure", "Isa_COL.chapter",
"Isa_COL.listing", "Isa_COL.section", "Isa_COL.paragraph",
"Isa_COL.subsection", "Isa_COL.text_element", "Isa_COL.subsubsection"]
val docclass_tab = map fst (Name_Space.dest_table (DOF_core.get_onto_classes \<^context>));
in @{assert} (class_ids_so_far = docclass_tab) end\<close>
end
section\<open>For Test and Validation\<close>
text*[sdf] \<open> Lorem ipsum ... \<close> \<comment> \<open>anonymous reference\<close>
text*[sdfg :: F] \<open> Lorem ipsum ...\<close> \<comment> \<open>some F instance \<close>
end


@@ -0,0 +1,23 @@
session "Isabelle_DOF-Ontologies" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof]
directories
"CC_v3_1_R5"
"Conceptual"
"small_math"
"CENELEC_50128"
theories
"document_setup"
"document_templates"
"CC_v3_1_R5/CC_v3_1_R5"
"CC_v3_1_R5/CC_terminology"
"Conceptual/Conceptual"
"small_math/small_math"
"CENELEC_50128/CENELEC_50128"
"CENELEC_50128/CENELEC_50128_Documentation"
document_files
"root.bib"
"lstisadof-manual.sty"
"preamble.tex"
"figures/antiquotations-PIDE.png"
"figures/srac-as-es-application.png"
"figures/srac-definition.png"


@@ -0,0 +1,65 @@
%% Copyright (c) University of Exeter
%% University of Paris-Saclay
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause
%% Warning: Do Not Edit!
%% =====================
%% This is the root file for Isabelle/DOF using the beamer class.
%%
%% All customization and/or additional packages should be added to the file
%% preamble.tex.
\RequirePackage{ifvtex}
\documentclass[16x9,9pt]{beamer}
\PassOptionsToPackage{force}{DOF-scholarly_paper}
\title{No Title Given}
\usepackage{DOF-core}
\usepackage{textcomp}
\bibliographystyle{abbrvnat}
\RequirePackage{subcaption}
\providecommand{\institute}[1]{}%
\providecommand{\inst}[1]{}%
\providecommand{\orcidID}[1]{}%
\providecommand{\email}[1]{}%
\usepackage[numbers, sort&compress, sectionbib]{natbib}
\usepackage{hyperref}
\setcounter{tocdepth}{3}
\hypersetup{%
bookmarksdepth=3
,pdfpagelabels
,pageanchor=true
,bookmarksnumbered
,plainpages=false
} % more detailed digital TOC (aka bookmarks)
\sloppy
\allowdisplaybreaks[4]
\newenvironment{frontmatter}{}{}
\raggedbottom
\begin{document}
\begin{frame}
\maketitle
\end{frame}
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{{\bibliography{root}}}{}
\end{document}
%%% Local Variables:
%%% mode: latex
%%% TeX-master: t
%%% End:


@@ -0,0 +1,65 @@
%% Copyright (c) University of Exeter
%% University of Paris-Saclay
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause
%% Warning: Do Not Edit!
%% =====================
%% This is the root file for Isabelle/DOF using the beamer class.
%%
%% All customization and/or additional packages should be added to the file
%% preamble.tex.
\RequirePackage{ifvtex}
\documentclass[]{beamer}
\PassOptionsToPackage{force}{DOF-scholarly_paper}
\title{No Title Given}
\usepackage{beamerposter}
\usepackage{DOF-core}
\usepackage{textcomp}
\bibliographystyle{abbrvnat}
\RequirePackage{subcaption}
\providecommand{\institute}[1]{}%
\providecommand{\inst}[1]{}%
\providecommand{\orcidID}[1]{}%
\providecommand{\email}[1]{}%
\usepackage[numbers, sort&compress, sectionbib]{natbib}
\usepackage{hyperref}
\setcounter{tocdepth}{3}
\hypersetup{%
bookmarksdepth=3
,pdfpagelabels
,pageanchor=true
,bookmarksnumbered
,plainpages=false
} % more detailed digital TOC (aka bookmarks)
\sloppy
\allowdisplaybreaks[4]
\newenvironment{frontmatter}{}{}
\raggedbottom
\begin{document}
\begin{frame}[fragile]
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{{\bibliography{root}}}{}
\end{frame}
\end{document}
%%% Local Variables:
%%% mode: latex
%%% TeX-master: t
%%% End:


@@ -23,6 +23,7 @@
%% preamble.tex.
\documentclass[submission,copyright,creativecommons]{eptcs}
\title{No Title Given}
\usepackage{DOF-core}
\bibliographystyle{eptcs}% the mandatory bibstyle
@@ -66,7 +67,7 @@
\begin{document}
\maketitle
\input{session}
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{%
\bibliography{root}


@@ -19,14 +19,13 @@
%% you need to download lipics.cls from
%% https://www.dagstuhl.de/en/publications/lipics/instructions-for-authors/
%% and add it manually to the preamble.tex and the ROOT file.
%% Moreover, the option "document_comment_latex=true" needs to be set
%% in the ROOT file.
%%
%% All customization and/or additional packages should be added to the file
%% preamble.tex.
\documentclass[a4paper,UKenglish,cleveref, autoref,thm-restate]{lipics-v2021}
\bibliographystyle{plainurl}% the mandatory bibstyle
\title{No Title Given}
\usepackage[numbers, sort&compress, sectionbib]{natbib}
\usepackage{DOF-core}
@@ -64,7 +63,7 @@
\begin{document}
\maketitle
\input{session}
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{%
\small


@@ -0,0 +1,67 @@
%% Copyright (c) 2019-2022 University of Exeter
%% 2018-2022 University of Paris-Saclay
%% 2018-2019 The University of Sheffield
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause
%% Warning: Do Not Edit!
%% =====================
%% This is the root file for Isabelle/DOF using the sn-jnl class.
%%
%% All customization and/or additional packages should be added to the file
%% preamble.tex.
\documentclass[iicol]{sn-jnl}
\PassOptionsToPackage{force}{DOF-scholarly_paper}
\title{No Title Given}
\usepackage{DOF-core}
\bibliographystyle{sn-basic}
\let\proof\relax
\let\endproof\relax
\newcommand{\institute}[1]{}
\usepackage{manyfoot}
\usepackage{DOF-core}
\setcounter{tocdepth}{3}
\hypersetup{%
bookmarksdepth=3
,pdfpagelabels
,pageanchor=true
,bookmarksnumbered
,plainpages=false
} % more detailed digital TOC (aka bookmarks)
\sloppy
\allowdisplaybreaks[4]
\usepackage{subcaption}
\usepackage[size=footnotesize]{caption}
\let\DOFauthor\relax
\begin{document}
\selectlanguage{USenglish}%
\renewcommand{\bibname}{References}%
\renewcommand{\figurename}{Fig.}
\renewcommand{\abstractname}{Abstract.}
\renewcommand{\subsubsectionautorefname}{Sect.}
\renewcommand{\subsectionautorefname}{Sect.}
\renewcommand{\sectionautorefname}{Sect.}
\renewcommand{\figureautorefname}{Fig.}
\newcommand{\lstnumberautorefname}{Line}
\maketitle
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{{\bibliography{root}}}{}
\end{document}
%%% Local Variables:
%%% mode: latex
%%% TeX-master: t
%%% End:


@@ -23,6 +23,7 @@
\RequirePackage{fix-cm}
\documentclass[]{svjour3}
\title{No Title Given}
\usepackage{DOF-core}
\usepackage{mathptmx}
\bibliographystyle{abbrvnat}
@@ -40,8 +41,6 @@
} % more detailed digital TOC (aka bookmarks)
\sloppy
\allowdisplaybreaks[4]
\usepackage[caption]{subfig}
\usepackage[size=footnotesize]{caption}
\begin{document}
\selectlanguage{USenglish}%
@@ -52,12 +51,10 @@
\renewcommand{\subsectionautorefname}{Sect.}
\renewcommand{\sectionautorefname}{Sect.}
\renewcommand{\figureautorefname}{Fig.}
\newcommand{\subtableautorefname}{\tableautorefname}
\newcommand{\subfigureautorefname}{\figureautorefname}
\newcommand{\lstnumberautorefname}{Line}
\maketitle
\input{session}
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{{\bibliography{root}}}{}
\end{document}

Binary file not shown. (added, 96 KiB)

Binary file not shown. (added, 67 KiB)

Binary file not shown. (added, 50 KiB)


@@ -0,0 +1,327 @@
%% Copyright (C) 2018 The University of Sheffield
%% 2018-2021 The University of Paris-Saclay
%% 2019-2021 The University of Exeter
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause
\usepackage{listings}
\usepackage{listingsutf8}
\usepackage{tikz}
\usepackage[many]{tcolorbox}
\tcbuselibrary{listings}
\tcbuselibrary{skins}
\usepackage{xstring}
\definecolor{OliveGreen} {cmyk}{0.64,0,0.95,0.40}
\definecolor{BrickRed} {cmyk}{0,0.89,0.94,0.28}
\definecolor{Blue} {cmyk}{1,1,0,0}
\definecolor{CornflowerBlue}{cmyk}{0.65,0.13,0,0}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <antiquotations>
%% Hack: re-defining tag types for supporting highlighting of antiquotations
\gdef\lst@tagtypes{s}
\gdef\lst@TagKey#1#2{%
\lst@Delim\lst@tagstyle #2\relax
{Tag}\lst@tagtypes #1%
{\lst@BeginTag\lst@EndTag}%
\@@end\@empty{}}
\lst@Key{tag}\relax{\lst@TagKey\@empty{#1}}
\lst@Key{tagstyle}{}{\def\lst@tagstyle{#1}}
\lst@AddToHook{EmptyStyle}{\let\lst@tagstyle\@empty}
\gdef\lst@BeginTag{%
\lst@DelimOpen
\lst@ifextags\else
{\let\lst@ifkeywords\iftrue
\lst@ifmarkfirstintag \lst@firstintagtrue \fi}}
\lst@AddToHookExe{ExcludeDelims}{\let\lst@ifextags\iffalse}
\gdef\lst@EndTag{\lst@DelimClose\lst@ifextags\else}
\lst@Key{usekeywordsintag}t[t]{\lstKV@SetIf{#1}\lst@ifusekeysintag}
\lst@Key{markfirstintag}f[t]{\lstKV@SetIf{#1}\lst@ifmarkfirstintag}
\gdef\lst@firstintagtrue{\global\let\lst@iffirstintag\iftrue}
\global\let\lst@iffirstintag\iffalse
\lst@AddToHook{PostOutput}{\lst@tagresetfirst}
\lst@AddToHook{Output}
{\gdef\lst@tagresetfirst{\global\let\lst@iffirstintag\iffalse}}
\lst@AddToHook{OutputOther}{\gdef\lst@tagresetfirst{}}
\lst@AddToHook{Output}
{\ifnum\lst@mode=\lst@tagmode
\lst@iffirstintag \let\lst@thestyle\lst@gkeywords@sty \fi
\lst@ifusekeysintag\else \let\lst@thestyle\lst@gkeywords@sty\fi
\fi}
\lst@NewMode\lst@tagmode
\gdef\lst@Tag@s#1#2\@empty#3#4#5{%
\lst@CArg #1\relax\lst@DefDelimB {}{}%
{\ifnum\lst@mode=\lst@tagmode \expandafter\@gobblethree \fi}%
#3\lst@tagmode{#5}%
\lst@CArg #2\relax\lst@DefDelimE {}{}{}#4\lst@tagmode}%
\gdef\lst@BeginCDATA#1\@empty{%
\lst@TrackNewLines \lst@PrintToken
\lst@EnterMode\lst@GPmode{}\let\lst@ifmode\iffalse
\lst@mode\lst@tagmode #1\lst@mode\lst@GPmode\relax\lst@modetrue}
%
\def\beginlstdelim#1#2#3%
{%
\def\endlstdelim{\texttt{\textbf{\color{black!60}#2}}\egroup}%
\ttfamily\textbf{\color{black!60}#1}\bgroup\rmfamily\color{#3}\aftergroup\endlstdelim%
}
%% </antiquotations>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <isar>
\providecolor{isar}{named}{blue}
\renewcommand{\isacommand}[1]{\textcolor{OliveGreen!60}{\ttfamily\bfseries #1}}
\newcommand{\inlineisarbox}[1]{#1}
\NewTColorBox[]{isarbox}{}{
,boxrule=0pt
,boxsep=0pt
,colback=white!90!isar
,enhanced jigsaw
,borderline west={2pt}{0pt}{isar!60!black}
,sharp corners
%,before skip balanced=0.5\baselineskip plus 2pt % works only with Tex Live 2020 and later
,enlarge top by=0mm
,enhanced
,overlay={\node[draw,fill=isar!60!black,xshift=0pt,anchor=north
east,font=\bfseries\footnotesize\color{white}]
at (frame.north east) {Isar};}
}
%% </isar>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <out>
\providecolor{out}{named}{green}
\newtcblisting{out}[1][]{%
listing only%
,boxrule=0pt
,boxsep=0pt
,colback=white!90!out
,enhanced jigsaw
,borderline west={2pt}{0pt}{out!60!black}
,sharp corners
% ,before skip=10pt
% ,after skip=10pt
,enlarge top by=0mm
,enhanced
,overlay={\node[draw,fill=out!60!black,xshift=0pt,anchor=north
east,font=\bfseries\footnotesize\color{white}]
at (frame.north east) {Document};}
,listing options={
breakatwhitespace=true
,columns=flexible%
,basicstyle=\small\rmfamily
,mathescape
,#1
}
}%
%% </out>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <sml>
\lstloadlanguages{ML}
\providecolor{sml}{named}{red}
\lstdefinestyle{sml}{
,escapechar=ë%
,basicstyle=\ttfamily%
,commentstyle=\itshape%
,keywordstyle=\bfseries\color{CornflowerBlue}%
,ndkeywordstyle=\color{green}%
,language=ML
% ,literate={%
% {<@>}{@}1%
% }
,keywordstyle=[6]{\itshape}%
,morekeywords=[6]{args_type}%
,tag=**[s]{@\{}{\}}%
,tagstyle=\color{CornflowerBlue}%
,markfirstintag=true%
}%
\def\inlinesml{\lstinline[style=sml,breaklines=true,breakatwhitespace=true]}
\newtcblisting{sml}[1][]{%
listing only%
,boxrule=0pt
,boxsep=0pt
,colback=white!90!sml
,enhanced jigsaw
,borderline west={2pt}{0pt}{sml!60!black}
,sharp corners
% ,before skip=10pt
% ,after skip=10pt
,enlarge top by=0mm
,enhanced
,overlay={\node[draw,fill=sml!60!black,xshift=0pt,anchor=north
east,font=\bfseries\footnotesize\color{white}]
at (frame.north east) {SML};}
,listing options={
style=sml
,columns=flexible%
,basicstyle=\small\ttfamily
,#1
}
}%
%% </sml>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <latex>
\lstloadlanguages{TeX}
\providecolor{ltx}{named}{yellow}
\lstdefinestyle{lltx}{language=[AlLaTeX]TeX,
,basicstyle=\ttfamily%
,showspaces=false%
,escapechar=ë
,showlines=false%
,morekeywords={newisadof}
% ,keywordstyle=\bfseries%
% Defining 2-keywords
,keywordstyle=[1]{\color{BrickRed!60}\bfseries}%
% Defining 3-keywords
,keywordstyle=[2]{\color{OliveGreen!60}\bfseries}%
% Defining 4-keywords
,keywordstyle=[3]{\color{black!60}\bfseries}%
% Defining 5-keywords
,keywordstyle=[4]{\color{Blue!70}\bfseries}%
% Defining 6-keywords
,keywordstyle=[5]{\itshape}%
%
}
\lstdefinestyle{ltx}{style=lltx,
basicstyle=\ttfamily\small}%
\def\inlineltx{\lstinline[style=ltx, breaklines=true,columns=fullflexible]}
% see
% https://tex.stackexchange.com/questions/247643/problem-with-tcblisting-first-listed-latex-command-is-missing
\NewTCBListing{ltx}{ !O{} }{%
listing only%
,boxrule=0pt
,boxsep=0pt
,colback=white!90!ltx
,enhanced jigsaw
,borderline west={2pt}{0pt}{ltx!60!black}
,sharp corners
% ,before skip=10pt
% ,after skip=10pt
,enlarge top by=0mm
,enhanced
,overlay={\node[draw,fill=ltx!60!black,xshift=0pt,anchor=north
east,font=\bfseries\footnotesize\color{white}]
at (frame.north east) {\LaTeX};}
,listing options={
style=lltx,
,columns=flexible%
,basicstyle=\small\ttfamily
,#1
}
}%
%% </latex>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <bash>
\providecolor{bash}{named}{black}
\lstloadlanguages{bash}
\lstdefinestyle{bash}{%
language=bash
,escapechar=ë
,basicstyle=\ttfamily%
,showspaces=false%
,showlines=false%
,columns=flexible%
% ,keywordstyle=\bfseries%
% Defining 2-keywords
,keywordstyle=[1]{\color{BrickRed!60}\bfseries}%
% Defining 3-keywords
,keywordstyle=[2]{\color{OliveGreen!60}\bfseries}%
% Defining 4-keywords
,keywordstyle=[3]{\color{black!60}\bfseries}%
% Defining 5-keywords
,keywordstyle=[4]{\color{Blue!80}\bfseries}%
,alsoletter={*,-,:,~,/}
,morekeywords=[4]{}%
% Defining 6-keywords
,keywordstyle=[5]{\itshape}%
%
}
\def\inlinebash{\lstinline[style=bash, breaklines=true,columns=fullflexible]}
\newcommand\@isabsolutepath[3]{%
\StrLeft{#1}{1}[\firstchar]%
\IfStrEq{\firstchar}{/}{#2}{#3}%
}
\newcommand{\@homeprefix}[1]{%
\ifthenelse{\equal{#1}{}}{\textasciitilde}{\textasciitilde/}%
}
\newcommand{\prompt}[1]{%
\color{Blue!80}\textbf{\texttt{%
achim@logicalhacking:{\@isabsolutepath{#1}{#1}{\@homeprefix{#1}#1}}\$}}%
}
\newtcblisting{bash}[1][]{%
listing only%
,boxrule=0pt
,boxsep=0pt
,colback=white!90!bash
,enhanced jigsaw
,borderline west={2pt}{0pt}{bash!60!black}
,sharp corners
% ,before skip=10pt
% ,after skip=10pt
,enlarge top by=0mm
,enhanced
,overlay={\node[draw,fill=bash!60!black,xshift=0pt,anchor=north
east,font=\bfseries\footnotesize\color{white}]
at (frame.north east) {Bash};}
,listing options={
style=bash
,columns=flexible%
,breaklines=true%
,prebreak=\mbox{\space\textbackslash}%
,basicstyle=\small\ttfamily%
,#1
}
}%
%% </bash>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <config>
\providecolor{config}{named}{gray}
\newtcblisting{config}[2][]{%
listing only%
,boxrule=0pt
,boxsep=0pt
,colback=white!90!config
,enhanced jigsaw
,borderline west={2pt}{0pt}{config!60!black}
,sharp corners
% ,before skip=10pt
% ,after skip=10pt
,enlarge top by=0mm
,enhanced
,overlay={\node[draw,fill=config!60!black,xshift=0pt,anchor=north
east,font=\bfseries\footnotesize\color{white}]
at (frame.north east) {#2};}
,listing options={
breakatwhitespace=true
,columns=flexible%
,basicstyle=\small\ttfamily
,mathescape
,#1
}
}%
%% </config>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
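Once this style file is loaded (e.g. via `\usepackage{lstisadof-manual}`), the listing environments it defines can be used along the following lines; the function, session, and prompt-path names are illustrative placeholders, not part of the package:

```latex
% Illustrative usage sketch; add, MySession, and MyProject are placeholders.
\begin{sml}
fun add (x:int) y = x + y
\end{sml}

\begin{ltx}
\newisadof{textDOTmyUNDERSCOREclass}[label=,type=][1]{#1}
\end{ltx}

\begin{bash}
ë\prompt{MyProject}ë isabelle build -d . MySession
\end{bash}
```

Note that the `bash` style sets `escapechar=ë`, so the `\prompt` macro must be wrapped in `ë...ë` inside the listing, as sketched above.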


@@ -0,0 +1,4 @@
\usepackage{dirtree}
\renewcommand*\DTstylecomment{\ttfamily\itshape}
\usepackage{lstisadof-manual}


@@ -451,7 +451,7 @@
journal = {Archive of Formal Proofs},
month = may,
year = 2010,
note = {\url{http://isa-afp.org/entries/Regular-Sets.html}, Formal
note = {\url{https://isa-afp.org/entries/Regular-Sets.html}, Formal
proof development},
issn = {2150-914x}
}
@@ -462,7 +462,7 @@
journal = {Archive of Formal Proofs},
month = mar,
year = 2004,
note = {\url{http://isa-afp.org/entries/Functional-Automata.html},
note = {\url{https://isa-afp.org/entries/Functional-Automata.html},
Formal proof development},
issn = {2150-914x}
}


@@ -0,0 +1,20 @@
(*<*)
theory "document_setup"
imports
"Isabelle_DOF.technical_report"
"Isabelle_DOF-Ontologies.CENELEC_50128"
"Isabelle_DOF-Ontologies.CC_terminology"
begin
use_template "scrreprt-modern"
use_ontology "Isabelle_DOF.technical_report" and "Isabelle_DOF-Ontologies.CENELEC_50128"
and "Isabelle_DOF-Ontologies.CC_terminology"
(*>*)
title*[title::title] \<open>Isabelle/DOF\<close>
subtitle*[subtitle::subtitle]\<open>Ontologies\<close>
(*<*)
end
(*>*)


@@ -0,0 +1,32 @@
(*************************************************************************
* Copyright (C)
* 2019 The University of Exeter
* 2018-2019 The University of Paris-Saclay
* 2018 The University of Sheffield
*
* License:
* This program can be redistributed and/or modified under the terms
* of the 2-clause BSD-style license.
*
* SPDX-License-Identifier: BSD-2-Clause
*************************************************************************)
theory
"document_templates"
imports
"Isabelle_DOF.Isa_DOF"
begin
define_template "./document-templates/root-eptcs-UNSUPPORTED.tex"
"Unsupported template for the EPTCS class. Not for general use."
define_template "./document-templates/root-lipics-v2021-UNSUPPORTED.tex"
"Unsupported template for LIPICS (v2021). Not for general use."
define_template "./document-templates/root-svjour3-UNSUPPORTED.tex"
"Unsupported template for SVJOUR. Not for general use."
define_template "./document-templates/root-sn-article-UNSUPPORTED.tex"
"Unsupported template for Springer Nature's template. Not for general use."
define_template "./document-templates/root-beamer-UNSUPPORTED.tex"
"Unsupported template for presentations. Not for general use."
define_template "./document-templates/root-beamerposter-UNSUPPORTED.tex"
"Unsupported template for poster. Not for general use."
end


@@ -11,10 +11,11 @@
* SPDX-License-Identifier: BSD-2-Clause
*************************************************************************)
section\<open>An example ontology for a math paper\<close>
chapter\<open>An example ontology for a math paper\<close>
theory small_math
imports "Isabelle_DOF.Isa_COL"
imports
"Isabelle_DOF.Isa_COL"
begin
doc_class title =
@@ -64,18 +65,25 @@ doc_class result = technical +
ML\<open>fun check_invariant_invariant oid {is_monitor:bool} ctxt =
let val kind_term = AttributeAccess.compute_attr_access ctxt "kind" oid @{here} @{here}
val property_termS = AttributeAccess.compute_attr_access ctxt "property" oid @{here} @{here}
ML\<open>
fn thy =>
let fun check_invariant_invariant oid {is_monitor:bool} ctxt =
let val kind_term = ISA_core.compute_attr_access ctxt "kind" oid NONE @{here}
val property_termS = ISA_core.compute_attr_access ctxt "property" oid NONE @{here}
val tS = HOLogic.dest_list property_termS
in case kind_term of
@{term "proof"} => if not(null tS) then true
else error("class invariant violation")
| _ => false
end
val cid = "result"
val cid_long = DOF_core.get_onto_class_name_global cid thy
val binding = DOF_core.binding_from_onto_class_pos cid thy
in DOF_core.add_ml_invariant binding
(DOF_core.make_ml_invariant (check_invariant_invariant, cid_long)) thy end
\<close>
setup\<open>DOF_core.update_class_invariant "small_math.result" check_invariant_invariant\<close>
(*setup\<open>DOF_core.add_ml_invariant "small_math.result" check_invariant_invariant\<close>*)
doc_class example = technical +
@@ -85,7 +93,7 @@ doc_class "conclusion" = text_section +
establish :: "(contribution_claim \<times> result) set"
text\<open> Besides subtyping, there is another relation between
doc_classes: a class can be a \<^emph>\<open>monitor\<close> to other ones,
doc\_classes: a class can be a \<^emph>\<open>monitor\<close> to other ones,
which is expressed by occurrence in the where clause.
While sub-classing refers to data-inheritance of attributes,
a monitor captures structural constraints -- the order --
@@ -137,10 +145,10 @@ fun dest_option _ (Const (@{const_name "None"}, _)) = NONE
in
fun check ctxt cidS mon_id pos =
let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos @{here}
fun check ctxt cidS mon_id pos_opt =
let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos_opt @{here}
val groups = partition (Context.proof_of ctxt) cidS trace
fun get_level_raw oid = AttributeAccess.compute_attr_access ctxt "level" oid @{here} @{here};
fun get_level_raw oid = ISA_core.compute_attr_access ctxt "level" oid NONE @{here};
fun get_level oid = dest_option (snd o HOLogic.dest_number) (get_level_raw (oid));
fun check_level_hd a = case (get_level (snd a)) of
NONE => error("Invariant violation: leading section" ^ snd a ^
@@ -164,10 +172,17 @@ end
end
\<close>
setup\<open> let val cidS = ["small_math.introduction","small_math.technical", "small_math.conclusion"];
fun body moni_oid _ ctxt = (Small_Math_trace_invariant.check ctxt cidS moni_oid @{here};
setup\<open>
fn thy =>
let val cidS = ["small_math.introduction","small_math.technical", "small_math.conclusion"];
fun body moni_oid _ ctxt = (Small_Math_trace_invariant.check ctxt cidS moni_oid NONE;
true)
in DOF_core.update_class_invariant "small_math.article" body end\<close>
val ctxt = Proof_Context.init_global thy
val cid = "article"
val cid_long = DOF_core.get_onto_class_name_global cid thy
val binding = DOF_core.binding_from_onto_class_pos cid thy
in DOF_core.add_ml_invariant binding (DOF_core.make_ml_invariant (body, cid_long)) thy end
\<close>
end


@@ -0,0 +1,12 @@
theory Fun_Function
imports "Isabelle_DOF-Proofs.Very_Deep_DOF"
keywords "function*" "termination*" :: thy_goal_defn and
"fun*" :: thy_defn
begin
ML_file "specification.ML"
ML_file "function.ML"
ML_file "fun.ML"
end

Isabelle_DOF-Proofs/ROOT

@@ -0,0 +1,10 @@
session "Isabelle_DOF-Proofs" (proofs) = "HOL-Proofs" +
options [document = false, record_proofs = 2, parallel_limit = 500, document_build = dof]
sessions
"Isabelle_DOF"
Metalogic_ProofChecker
theories
Isabelle_DOF.ontologies
Isabelle_DOF.Isa_DOF
Very_Deep_DOF
Reification_Test

File diff suppressed because it is too large


@@ -0,0 +1,19 @@
theory Very_Deep_DOF
imports "Isabelle_DOF-Proofs.Very_Deep_Interpretation"
begin
(* tests *)
term "@{typ ''int => int''}"
term "@{term ''Bound 0''}"
term "@{thm ''refl''}"
term "@{docitem ''<doc_ref>''}"
ML\<open> @{term "@{docitem ''<doc_ref>''}"}\<close>
term "@{typ \<open>int \<Rightarrow> int\<close>}"
term "@{term \<open>\<forall>x. P x \<longrightarrow> Q\<close>}"
term "@{thm \<open>refl\<close>}"
term "@{docitem \<open>doc_ref\<close>}"
ML\<open> @{term "@{docitem \<open>doc_ref\<close>}"}\<close>
end


@@ -0,0 +1,251 @@
theory Very_Deep_Interpretation
imports "Isabelle_DOF.Isa_COL"
Metalogic_ProofChecker.ProofTerm
begin
subsection\<open> Syntax \<close>
\<comment> \<open>and others in the future: file, http, thy, ...\<close>
(* Delete shallow interpretation notations (mixfixes) of the term anti-quotations,
so we can use them for the deep interpretation *)
no_notation "Isabelle_DOF_typ" ("@{typ _}")
no_notation "Isabelle_DOF_term" ("@{term _}")
no_notation "Isabelle_DOF_thm" ("@{thm _}")
no_notation "Isabelle_DOF_file" ("@{file _}")
no_notation "Isabelle_DOF_thy" ("@{thy _}")
no_notation "Isabelle_DOF_docitem" ("@{docitem _}")
no_notation "Isabelle_DOF_docitem_attr" ("@{docitemattr (_) :: (_)}")
no_notation "Isabelle_DOF_trace_attribute" ("@{trace'_-attribute _}")
consts Isabelle_DOF_typ :: "string \<Rightarrow> typ" ("@{typ _}")
consts Isabelle_DOF_term :: "string \<Rightarrow> term" ("@{term _}")
datatype "thm" = Isabelle_DOF_thm string ("@{thm _}") | Thm_content ("proof":proofterm)
datatype "thms_of" = Isabelle_DOF_thms_of string ("@{thms-of _}")
datatype "file" = Isabelle_DOF_file string ("@{file _}")
datatype "thy" = Isabelle_DOF_thy string ("@{thy _}")
consts Isabelle_DOF_docitem :: "string \<Rightarrow> 'a" ("@{docitem _}")
datatype "docitem_attr" = Isabelle_DOF_docitem_attr string string ("@{docitemattr (_) :: (_)}")
consts Isabelle_DOF_trace_attribute :: "string \<Rightarrow> (string * string) list" ("@{trace'_-attribute _}")
subsection\<open> Semantics \<close>
ML\<open>
structure Meta_ISA_core =
struct
fun ML_isa_check_trace_attribute thy (term, _, pos) s =
let
val oid = (HOLogic.dest_string term
handle TERM(_,[t]) => error ("wrong term format: must be string constant: "
^ Syntax.string_of_term_global thy t ))
val _ = DOF_core.get_instance_global oid thy
in SOME term end
fun reify_typ (Type (s, typ_list)) =
\<^Const>\<open>Ty\<close> $ HOLogic.mk_literal s $ HOLogic.mk_list \<^Type>\<open>typ\<close> (map reify_typ typ_list)
| reify_typ (TFree (name, sort)) =
\<^Const>\<open>Tv\<close> $(\<^Const>\<open>Free\<close> $ HOLogic.mk_literal name)
$ (HOLogic.mk_set \<^typ>\<open>class\<close> (map HOLogic.mk_literal sort))
| reify_typ (TVar (indexname, sort)) =
let val (name, index_value) = indexname
in \<^Const>\<open>Tv\<close>
$ (\<^Const>\<open>Var\<close>
$ HOLogic.mk_prod (HOLogic.mk_literal name, HOLogic.mk_number \<^Type>\<open>int\<close> index_value))
$ (HOLogic.mk_set \<^typ>\<open>class\<close> (map HOLogic.mk_literal sort)) end
fun ML_isa_elaborate_typ (thy:theory) _ _ term_option _ =
case term_option of
NONE => error("Wrong term option. You must use a defined term")
| SOME term => let
val typ_name = HOLogic.dest_string term
val typ = Syntax.read_typ_global thy typ_name
in reify_typ typ end
fun reify_term (Const (name, typ)) =\<^Const>\<open>Ct\<close> $ HOLogic.mk_literal name $ reify_typ typ
| reify_term (Free (name, typ)) =
\<^Const>\<open>Fv\<close> $ (\<^Const>\<open>Free\<close> $ HOLogic.mk_literal name) $ reify_typ typ
| reify_term (Var (indexname, typ)) =
let val (name, index_value) = indexname
in \<^Const>\<open>Fv\<close>
$ (\<^Const>\<open>Var\<close>
$ HOLogic.mk_prod (HOLogic.mk_literal name, HOLogic.mk_number \<^Type>\<open>int\<close> index_value))
$ reify_typ typ end
| reify_term (Bound i) = \<^Const>\<open>Bv\<close> $ HOLogic.mk_nat i
| reify_term (Abs (_, typ, term)) = \<^Const>\<open>Abs\<close> $ reify_typ typ $ reify_term term
| reify_term (Term.$ (t1, t2)) = \<^Const>\<open>App\<close> $ reify_term t1 $ reify_term t2
fun ML_isa_elaborate_term (thy:theory) _ _ term_option pos =
case term_option of
NONE => error("Wrong term option. You must use a defined term")
| SOME term => let
val term_name = HOLogic.dest_string term
val term = Syntax.read_term_global thy term_name
val term' = DOF_core.transduce_term_global {mk_elaboration=true} (term,pos) thy
val value = Value_Command.value (Proof_Context.init_global thy) term'
in reify_term value end
(*fun ML_isa_elaborate_term' (thy:theory) _ _ term_option pos =
case term_option of
NONE => error("Wrong term option. You must use a defined term")
| SOME term => let
val term_name = HOLogic.dest_string term
val term = Syntax.read_term_global thy term_name
val term' = ML_isa_elaborate_term (thy:theory) _ pos term_option _
in reify_term term end*)
fun reify_proofterm (PBound i) =\<^Const>\<open>PBound\<close> $ (HOLogic.mk_nat i)
| reify_proofterm (Abst (_, typ_option, proof)) =
\<^Const>\<open>Abst\<close> $ reify_typ (the typ_option) $ reify_proofterm proof
| reify_proofterm (AbsP (_, term_option, proof)) =
\<^Const>\<open>AbsP\<close> $ reify_term (the term_option) $ reify_proofterm proof
| reify_proofterm (op % (proof, term_option)) =
\<^Const>\<open>Appt\<close> $ reify_proofterm proof $ reify_term (the term_option)
| reify_proofterm (op %% (proof1, proof2)) =
\<^Const>\<open>AppP\<close> $ reify_proofterm proof1 $ reify_proofterm proof2
| reify_proofterm (Hyp term) = \<^Const>\<open>Hyp\<close> $ (reify_term term)
| reify_proofterm (PAxm (_, term, typ_list_option)) =
let
val tvars = rev (Term.add_tvars term [])
val meta_tvars = map (fn ((name, index_value), sort) =>
HOLogic.mk_prod
(\<^Const>\<open>Var\<close>
$ HOLogic.mk_prod
(HOLogic.mk_literal name, HOLogic.mk_number \<^Type>\<open>int\<close> index_value)
, HOLogic.mk_set \<^typ>\<open>class\<close> (map HOLogic.mk_literal sort))) tvars
val typ_list = case typ_list_option of
NONE => meta_tvars |> map (K dummyT)
| SOME T => T
val meta_typ_list =
HOLogic.mk_list @{typ "tyinst"} (map2 (fn x => fn y => HOLogic.mk_prod (x, y))
meta_tvars (map reify_typ (typ_list)))
in \<^Const>\<open>PAxm\<close> $ reify_term term $ meta_typ_list end
| reify_proofterm (PClass (typ, class)) =
\<^Const>\<open>OfClass\<close> $ reify_typ typ $ HOLogic.mk_literal class
| reify_proofterm (PThm ({prop = prop, types = types, ...}, _)) =
let
val tvars = rev (Term.add_tvars prop [])
val meta_tvars = map (fn ((name, index_value), sort) =>
HOLogic.mk_prod
(\<^Const>\<open>Var\<close>
$ HOLogic.mk_prod
(HOLogic.mk_literal name, HOLogic.mk_number \<^Type>\<open>int\<close> index_value)
, HOLogic.mk_set \<^typ>\<open>class\<close> (map HOLogic.mk_literal sort))) tvars
val meta_typ_list =
HOLogic.mk_list \<^typ>\<open>tyinst\<close> (map2 (fn x => fn y => HOLogic.mk_prod (x, y))
meta_tvars (map reify_typ (the types)))
in \<^Const>\<open>PAxm\<close> $ reify_term prop $ meta_typ_list end
fun ML_isa_elaborate_thm (thy:theory) _ _ term_option pos =
case term_option of
NONE => ISA_core.err ("Malformed term annotation") pos
| SOME term =>
let
val thm_name = HOLogic.dest_string term
(*val _ = writeln ("In ML_isa_elaborate_thm thm_name: " ^ \<^make_string> thm_name)*)
val thm = Proof_Context.get_thm (Proof_Context.init_global thy) thm_name
(*val _ = writeln ("In ML_isa_elaborate_thm thm: " ^ \<^make_string> thm)*)
val body = Proofterm.strip_thm_body (Thm.proof_body_of thm);
val prf = Proofterm.proof_of body;
(* Proof_Syntax.standard_proof_of reconstructs the proof and seems to rewrite
the option arguments (with a value NONE) of the proof datatype constructors,
at least for PAxm, with "SOME (typ/term)",
allowing us to use the projection function "the".
Maybe the function can deal with
all the option types of the proof datatype constructors *)
val proof = Proof_Syntax.standard_proof_of
{full = true, expand_name = Thm.expand_name thm} thm
(*val _ = writeln ("In ML_isa_elaborate_thm proof: " ^ \<^make_string> proof)*)
(* After a small discussion with Simon Roßkopf, it seems preferable to use
Thm.reconstruct_proof_of instead of Proof_Syntax.standard_proof_of
whose operation is not well known.
Thm.reconstruct_proof_of seems sufficient to have a reifiable PAxm
in the metalogic. *)
val proof' = Thm.reconstruct_proof_of thm
(*in \<^Const>\<open>Thm_content\<close> $ reify_proofterm prf end*)
(*in \<^Const>\<open>Thm_content\<close> $ reify_proofterm proof end*)
in \<^Const>\<open>Thm_content\<close> $ reify_proofterm proof end
fun ML_isa_elaborate_thms_of (thy:theory) _ _ term_option pos =
case term_option of
NONE => ISA_core.err ("Malformed term annotation") pos
| SOME term =>
let
val thm_name = HOLogic.dest_string term
val thm = Proof_Context.get_thm (Proof_Context.init_global thy) thm_name
val body = Proofterm.strip_thm_body (Thm.proof_body_of thm)
val all_thms_name = Proofterm.fold_body_thms (fn {name, ...} => insert (op =) name) [body] []
(*val all_thms = map (Proof_Context.get_thm (Proof_Context.init_global thy)) all_thms_name*)
(*val all_proofs = map (Proof_Syntax.standard_proof_of
{full = true, expand_name = Thm.expand_name thm}) all_thms*)
(*in HOLogic.mk_list \<^Type>\<open>thm\<close> (map (fn proof => \<^Const>\<open>Thm_content\<close> $ reify_proofterm proof) all_proofs) end*)
in HOLogic.mk_list \<^typ>\<open>string\<close> (map HOLogic.mk_string all_thms_name) end
fun ML_isa_elaborate_trace_attribute (thy:theory) _ _ term_option pos =
case term_option of
NONE => ISA_core.err ("Malformed term annotation") pos
| SOME term =>
let
val oid = HOLogic.dest_string term
val traces = ISA_core.compute_attr_access (Context.Theory thy) "trace" oid NONE pos
fun conv (\<^Const>\<open>Pair \<^typ>\<open>doc_class rexp\<close> \<^typ>\<open>string\<close>\<close>
$ (\<^Const>\<open>Atom \<^typ>\<open>doc_class\<close>\<close> $ (\<^Const>\<open>mk\<close> $ s)) $ S) =
let val s' = DOF_core.get_onto_class_name_global (HOLogic.dest_string s) thy
in \<^Const>\<open>Pair \<^typ>\<open>string\<close> \<^typ>\<open>string\<close>\<close> $ HOLogic.mk_string s' $ S end
val traces' = map conv (HOLogic.dest_list traces)
in HOLogic.mk_list \<^Type>\<open>prod \<^typ>\<open>string\<close> \<^typ>\<open>string\<close>\<close> traces' end
end; (* struct *)
\<close>
ML\<open>
val ty1 = Meta_ISA_core.reify_typ @{typ "int"}
val ty2 = Meta_ISA_core.reify_typ @{typ "int \<Rightarrow> bool"}
val ty3 = Meta_ISA_core.reify_typ @{typ "prop"}
val ty4 = Meta_ISA_core.reify_typ @{typ "'a list"}
\<close>
ML\<open>
val t1 = Meta_ISA_core.reify_term @{term "1::int"}
val t2 = Meta_ISA_core.reify_term @{term "\<lambda>x. x = 1"}
val t3 = Meta_ISA_core.reify_term @{term "[2, 3::int]"}
\<close>
subsection\<open> Isar - Setup\<close>
(* Isa_transformers declaration for Isabelle_DOF term anti-quotations (typ, term, thm, etc.).
They must be declared in the same theory file as the declaration
of the Isabelle_DOF term anti-quotations! *)
setup\<open>
[(\<^type_name>\<open>thm\<close>, ISA_core.ML_isa_check_thm, Meta_ISA_core.ML_isa_elaborate_thm)
, (\<^type_name>\<open>thms_of\<close>, ISA_core.ML_isa_check_thm, Meta_ISA_core.ML_isa_elaborate_thms_of)
, (\<^type_name>\<open>file\<close>, ISA_core.ML_isa_check_file, ISA_core.ML_isa_elaborate_generic)]
|> fold (fn (n, check, elaborate) => fn thy =>
let val ns = Sign.tsig_of thy |> Type.type_space
val name = n
val {pos, ...} = Name_Space.the_entry ns name
val bname = Long_Name.base_name name
val binding = Binding.make (bname, pos)
|> Binding.prefix_name DOF_core.ISA_prefix
|> Binding.prefix false bname
in DOF_core.add_isa_transformer binding ((check, elaborate) |> DOF_core.make_isa_transformer) thy
end)
#>
([(\<^const_name>\<open>Isabelle_DOF_typ\<close>, ISA_core.ML_isa_check_typ, Meta_ISA_core.ML_isa_elaborate_typ)
,(\<^const_name>\<open>Isabelle_DOF_term\<close>, ISA_core.ML_isa_check_term, Meta_ISA_core.ML_isa_elaborate_term)
,(\<^const_name>\<open>Isabelle_DOF_docitem\<close>,
ISA_core.ML_isa_check_docitem, ISA_core.ML_isa_elaborate_generic)
,(\<^const_name>\<open>Isabelle_DOF_trace_attribute\<close>,
ISA_core.ML_isa_check_trace_attribute, ISA_core.ML_isa_elaborate_trace_attribute)]
|> fold (fn (n, check, elaborate) => fn thy =>
let val ns = Sign.consts_of thy |> Consts.space_of
val name = n
val {pos, ...} = Name_Space.the_entry ns name
val bname = Long_Name.base_name name
val binding = Binding.make (bname, pos)
in DOF_core.add_isa_transformer binding ((check, elaborate) |> DOF_core.make_isa_transformer) thy
end))
\<close>
end

Isabelle_DOF-Proofs/fun.ML

@@ -0,0 +1,179 @@
(* Title: HOL/Tools/Function/fun.ML
Author: Alexander Krauss, TU Muenchen
Command "fun": Function definitions with pattern splitting/completion
and automated termination proofs.
*)
signature FUNCTION_FUN =
sig
val fun_config : Function_Common.function_config
val add_fun : (binding * typ option * mixfix) list ->
Specification.multi_specs -> Function_Common.function_config ->
local_theory -> Proof.context
val add_fun_cmd : (binding * string option * mixfix) list ->
Specification.multi_specs_cmd -> Function_Common.function_config ->
bool -> local_theory -> Proof.context
end
structure Function_Fun : FUNCTION_FUN =
struct
open Function_Lib
open Function_Common
fun check_pats ctxt geq =
let
fun err str = error (cat_lines ["Malformed definition:",
str ^ " not allowed in sequential mode.",
Syntax.string_of_term ctxt geq])
fun check_constr_pattern (Bound _) = ()
| check_constr_pattern t =
let
val (hd, args) = strip_comb t
in
(case hd of
Const (hd_s, hd_T) =>
(case body_type hd_T of
Type (Tname, _) =>
(case Ctr_Sugar.ctr_sugar_of ctxt Tname of
SOME {ctrs, ...} => exists (fn Const (s, _) => s = hd_s) ctrs
| NONE => false)
| _ => false)
| _ => false) orelse err "Non-constructor pattern";
map check_constr_pattern args;
()
end
val (_, qs, gs, args, _) = split_def ctxt (K true) geq
val _ = if not (null gs) then err "Conditional equations" else ()
val _ = map check_constr_pattern args
(* just count occurrences to check linearity *)
val _ = if fold (fold_aterms (fn Bound _ => Integer.add 1 | _ => I)) args 0 > length qs
then err "Nonlinear patterns" else ()
in
()
end
fun mk_catchall fixes arity_of =
let
fun mk_eqn ((fname, fT), _) =
let
val n = arity_of fname
val (argTs, rT) = chop n (binder_types fT)
|> apsnd (fn Ts => Ts ---> body_type fT)
val qs = map Free (Name.invent Name.context "a" n ~~ argTs)
in
HOLogic.mk_eq(list_comb (Free (fname, fT), qs),
Const (\<^const_name>\<open>undefined\<close>, rT))
|> HOLogic.mk_Trueprop
|> fold_rev Logic.all qs
end
in
map mk_eqn fixes
end
fun add_catchall ctxt fixes spec =
let val fqgars = map (split_def ctxt (K true)) spec
val arity_of = map (fn (fname,_,_,args,_) => (fname, length args)) fqgars
|> AList.lookup (op =) #> the
in
spec @ mk_catchall fixes arity_of
end
fun further_checks ctxt origs tss =
let
fun fail_redundant t =
error (cat_lines ["Equation is redundant (covered by preceding clauses):", Syntax.string_of_term ctxt t])
fun warn_missing strs =
warning (cat_lines ("Missing patterns in function definition:" :: strs))
val (tss', added) = chop (length origs) tss
val _ = case chop 3 (flat added) of
([], []) => ()
| (eqs, []) => warn_missing (map (Syntax.string_of_term ctxt) eqs)
| (eqs, rest) => warn_missing (map (Syntax.string_of_term ctxt) eqs
@ ["(" ^ string_of_int (length rest) ^ " more)"])
val _ = (origs ~~ tss')
|> map (fn (t, ts) => if null ts then fail_redundant t else ())
in
()
end
fun sequential_preproc (config as FunctionConfig {sequential, ...}) ctxt fixes spec =
if sequential then
let
val (bnds, eqss) = split_list spec
val eqs = map the_single eqss
val feqs = eqs
|> tap (check_defs ctxt fixes) (* Standard checks *)
|> tap (map (check_pats ctxt)) (* More checks for sequential mode *)
val compleqs = add_catchall ctxt fixes feqs (* Completion *)
val spliteqs = Function_Split.split_all_equations ctxt compleqs
|> tap (further_checks ctxt feqs)
fun restore_spec thms =
bnds ~~ take (length bnds) (unflat spliteqs thms)
val spliteqs' = flat (take (length bnds) spliteqs)
val fnames = map (fst o fst) fixes
val indices = map (fn eq => find_index (curry op = (fname_of eq)) fnames) spliteqs'
fun sort xs = partition_list (fn i => fn (j,_) => i = j) 0 (length fnames - 1) (indices ~~ xs)
|> map (map snd)
val bnds' = bnds @ replicate (length spliteqs - length bnds) Binding.empty_atts
(* using theorem names for case name currently disabled *)
val case_names = map_index (fn (i, (_, es)) => mk_case_names i "" (length es))
(bnds' ~~ spliteqs) |> flat
in
(flat spliteqs, restore_spec, sort, case_names)
end
else
Function_Common.empty_preproc check_defs config ctxt fixes spec
val _ = Theory.setup (Context.theory_map (Function_Common.set_preproc sequential_preproc))
val fun_config = FunctionConfig { sequential=true, default=NONE,
domintros=false, partials=false }
fun gen_add_fun add lthy =
let
fun pat_completeness_auto ctxt =
Pat_Completeness.pat_completeness_tac ctxt 1
THEN auto_tac ctxt
fun prove_termination lthy =
Function.prove_termination NONE (Function_Common.termination_prover_tac false lthy) lthy
in
lthy
|> add pat_completeness_auto |> snd
|> prove_termination |> snd
end
fun add_fun a b c = gen_add_fun (Function.add_function a b c)
fun add_fun_cmd a b c int = gen_add_fun (fn tac => Function.add_function_cmd a b c tac int)
val _ =
Outer_Syntax.local_theory' \<^command_keyword>\<open>fun*\<close>
"define general recursive functions (short version)"
(function_parser fun_config
>> (fn (config, (fixes, specs)) => add_fun_cmd fixes specs config))
end


@@ -0,0 +1,288 @@
(* Title: HOL/Tools/Function/function.ML
Author: Alexander Krauss, TU Muenchen
Main entry points to the function package.
*)
signature FUNCTION =
sig
type info = Function_Common.info
val add_function: (binding * typ option * mixfix) list ->
Specification.multi_specs -> Function_Common.function_config ->
(Proof.context -> tactic) -> local_theory -> info * local_theory
val add_function_cmd: (binding * string option * mixfix) list ->
Specification.multi_specs_cmd -> Function_Common.function_config ->
(Proof.context -> tactic) -> bool -> local_theory -> info * local_theory
val function: (binding * typ option * mixfix) list ->
Specification.multi_specs -> Function_Common.function_config ->
local_theory -> Proof.state
val function_cmd: (binding * string option * mixfix) list ->
Specification.multi_specs_cmd -> Function_Common.function_config ->
bool -> local_theory -> Proof.state
val prove_termination: term option -> tactic -> local_theory ->
info * local_theory
val prove_termination_cmd: string option -> tactic -> local_theory ->
info * local_theory
val termination : term option -> local_theory -> Proof.state
val termination_cmd : string option -> local_theory -> Proof.state
val get_congs : Proof.context -> thm list
val get_info : Proof.context -> term -> info
end
structure Function : FUNCTION =
struct
open Function_Lib
open Function_Common
val simp_attribs =
@{attributes [simp, nitpick_simp]}
val psimp_attribs =
@{attributes [nitpick_psimp]}
fun note_derived (a, atts) (fname, thms) =
Local_Theory.note ((derived_name fname a, atts), thms) #> apfst snd
fun add_simps fnames post sort extra_qualify label mod_binding moreatts simps lthy =
let
val spec = post simps
|> map (apfst (apsnd (fn ats => moreatts @ ats)))
|> map (apfst (apfst extra_qualify))
val (saved_spec_simps, lthy') =
fold_map Local_Theory.note spec lthy
val saved_simps = maps snd saved_spec_simps
val simps_by_f = sort saved_simps
fun note fname simps =
Local_Theory.note ((mod_binding (derived_name fname label), []), simps) #> snd
in (saved_simps, fold2 note fnames simps_by_f lthy') end
fun prepare_function do_print prep fixspec eqns config lthy =
let
val ((fixes0, spec0), ctxt') = prep fixspec eqns lthy
val fixes = map (apfst (apfst Binding.name_of)) fixes0
val spec = map (fn (bnd, prop) => (bnd, [prop])) spec0
val (eqs, post, sort_cont, cnames) = get_preproc lthy config ctxt' fixes spec
val fnames = map (fst o fst) fixes0
val defname = Binding.conglomerate fnames;
val FunctionConfig {partials, default, ...} = config
val _ =
if is_some default
then legacy_feature "\"function (default)\" -- use 'partial_function' instead"
else ()
val ((goal_state, cont), lthy') =
Function_Mutual.prepare_function_mutual config defname fixes0 eqs lthy
fun afterqed [[proof]] lthy1 =
let
val result = cont lthy1 (Thm.close_derivation \<^here> proof)
val FunctionResult {fs, R, dom, psimps, simple_pinducts,
termination, domintros, cases, ...} = result
val pelims = Function_Elims.mk_partial_elim_rules lthy1 result
val concealed_partial = if partials then I else Binding.concealed
val addsmps = add_simps fnames post sort_cont
val (((((psimps', [pinducts']), [termination']), cases'), pelims'), lthy2) =
lthy1
|> addsmps (concealed_partial o Binding.qualify false "partial")
"psimps" concealed_partial psimp_attribs psimps
||>> Local_Theory.notes [((concealed_partial (derived_name defname "pinduct"), []),
simple_pinducts |> map (fn th => ([th],
[Attrib.case_names cnames, Attrib.consumes (1 - Thm.nprems_of th)] @
@{attributes [induct pred]})))]
||>> (apfst snd o
Local_Theory.note
((Binding.concealed (derived_name defname "termination"), []), [termination]))
||>> fold_map (note_derived ("cases", [Attrib.case_names cnames]))
(fnames ~~ map single cases)
||>> fold_map (note_derived ("pelims", [Attrib.consumes 1, Attrib.constraints 1]))
(fnames ~~ pelims)
||> (case domintros of NONE => I | SOME thms =>
Local_Theory.note ((derived_name defname "domintros", []), thms) #> snd)
val info =
{ add_simps=addsmps, fnames=fnames, case_names=cnames, psimps=psimps',
pinducts=snd pinducts', simps=NONE, inducts=NONE, termination=termination', totality=NONE,
fs=fs, R=R, dom=dom, defname=defname, is_partial=true, cases=flat cases',
pelims=pelims',elims=NONE}
val _ =
Proof_Display.print_consts do_print (Position.thread_data ()) lthy2
(K false) (map fst fixes)
in
(info,
lthy2 |> Local_Theory.declaration {syntax = false, pervasive = false, pos = \<^here>}
(fn phi => add_function_data (transform_function_data phi info)))
end
in
((goal_state, afterqed), lthy')
end
fun gen_add_function do_print prep fixspec eqns config tac lthy =
let
val ((goal_state, afterqed), lthy') =
prepare_function do_print prep fixspec eqns config lthy
val pattern_thm =
case SINGLE (tac lthy') goal_state of
NONE => error "pattern completeness and compatibility proof failed"
| SOME st => Goal.finish lthy' st
in
lthy'
|> afterqed [[pattern_thm]]
end
val add_function = gen_add_function false Specification.check_multi_specs
fun add_function_cmd a b c d int = gen_add_function int Specification.read_multi_specs a b c d
fun gen_function do_print prep fixspec eqns config lthy =
let
val ((goal_state, afterqed), lthy') =
prepare_function do_print prep fixspec eqns config lthy
in
lthy'
|> Proof.theorem NONE (snd oo afterqed) [[(Logic.unprotect (Thm.concl_of goal_state), [])]]
|> Proof.refine_singleton (Method.primitive_text (K (K goal_state)))
end
val function = gen_function false Specification.check_multi_specs
fun function_cmd a b c int = gen_function int Specification.read_multi_specs a b c
fun prepare_termination_proof prep_binding prep_term raw_term_opt lthy =
let
val term_opt = Option.map (prep_term lthy) raw_term_opt
val info =
(case term_opt of
SOME t =>
(case import_function_data t lthy of
SOME info => info
| NONE => error ("Not a function: " ^ quote (Syntax.string_of_term lthy t)))
| NONE =>
(case import_last_function lthy of
SOME info => info
| NONE => error "Not a function"))
val { termination, fs, R, add_simps, case_names, psimps,
pinducts, defname, fnames, cases, dom, pelims, ...} = info
val domT = domain_type (fastype_of R)
val goal = HOLogic.mk_Trueprop (HOLogic.mk_all ("x", domT, mk_acc domT R $ Free ("x", domT)))
fun afterqed [[raw_totality]] lthy1 =
let
val totality = Thm.close_derivation \<^here> raw_totality
val remove_domain_condition =
full_simplify (put_simpset HOL_basic_ss lthy1
addsimps [totality, @{thm True_implies_equals}])
val tsimps = map remove_domain_condition psimps
val tinduct = map remove_domain_condition pinducts
val telims = map (map remove_domain_condition) pelims
in
lthy1
|> add_simps prep_binding "simps" prep_binding simp_attribs tsimps
||> Code.declare_default_eqns (map (rpair true) tsimps)
||>> Local_Theory.note
((prep_binding (derived_name defname "induct"), [Attrib.case_names case_names]), tinduct)
||>> fold_map (note_derived ("elims", [Attrib.consumes 1, Attrib.constraints 1]))
(map prep_binding fnames ~~ telims)
|-> (fn ((simps,(_,inducts)), elims) => fn lthy2 =>
let val info' = { is_partial=false, defname=defname, fnames=fnames, add_simps=add_simps,
case_names=case_names, fs=fs, R=R, dom=dom, psimps=psimps, pinducts=pinducts,
simps=SOME simps, inducts=SOME inducts, termination=termination, totality=SOME totality,
cases=cases, pelims=pelims, elims=SOME elims}
|> transform_function_data (Morphism.binding_morphism "" prep_binding)
in
(info',
lthy2
|> Local_Theory.declaration {syntax = false, pervasive = false, pos = \<^here>}
(fn phi => add_function_data (transform_function_data phi info'))
|> Spec_Rules.add Binding.empty Spec_Rules.equational_recdef fs tsimps)
end)
end
in
(goal, afterqed, termination)
end
fun gen_prove_termination prep_term raw_term_opt tac lthy =
let
val (goal, afterqed, termination) =
prepare_termination_proof I prep_term raw_term_opt lthy
val totality = Goal.prove lthy [] [] goal (K tac)
in
afterqed [[totality]] lthy
end
val prove_termination = gen_prove_termination Syntax.check_term
val prove_termination_cmd = gen_prove_termination Syntax.read_term
fun gen_termination prep_term raw_term_opt lthy =
let
val (goal, afterqed, termination) =
prepare_termination_proof Binding.reset_pos prep_term raw_term_opt lthy
in
lthy
|> Proof_Context.note_thms ""
((Binding.empty, [Context_Rules.rule_del]), [([allI], [])]) |> snd
|> Proof_Context.note_thms ""
((Binding.empty, [Context_Rules.intro_bang (SOME 1)]), [([allI], [])]) |> snd
|> Proof_Context.note_thms ""
((Binding.name "termination", [Context_Rules.intro_bang (SOME 0)]),
[([Goal.norm_result lthy termination], [])]) |> snd
|> Proof.theorem NONE (snd oo afterqed) [[(goal, [])]]
end
val termination = gen_termination Syntax.check_term
val termination_cmd = gen_termination Syntax.read_term
(* Datatype hook to declare datatype congs as "function_congs" *)
fun add_case_cong n thy =
let
val cong = #case_cong (Old_Datatype_Data.the_info thy n)
|> safe_mk_meta_eq
in
Context.theory_map (Function_Context_Tree.add_function_cong cong) thy
end
val _ = Theory.setup (Old_Datatype_Data.interpretation (K (fold add_case_cong)))
(* get info *)
val get_congs = Function_Context_Tree.get_function_congs
fun get_info ctxt t = Function_Common.retrieve_function_data ctxt t
|> the_single |> snd
(* outer syntax *)
val _ =
Outer_Syntax.local_theory_to_proof' \<^command_keyword>\<open>function*\<close>
"define general recursive functions"
(function_parser default_config
>> (fn (config, (fixes, specs)) => function_cmd fixes specs config))
val _ =
Outer_Syntax.local_theory_to_proof \<^command_keyword>\<open>termination*\<close>
"prove termination of a recursive function"
(Scan.option Parse.term >> termination_cmd)
end


@@ -0,0 +1,451 @@
(* Title: Pure/Isar/specification.ML
Author: Makarius
Derived local theory specifications --- with type-inference and
toplevel polymorphism.
*)
signature SPECIFICATION =
sig
val read_props: string list -> (binding * string option * mixfix) list -> Proof.context ->
term list * Proof.context
val check_spec_open: (binding * typ option * mixfix) list ->
(binding * typ option * mixfix) list -> term list -> term -> Proof.context ->
((binding * typ option * mixfix) list * string list * (string -> Position.T list) * term) *
Proof.context
val read_spec_open: (binding * string option * mixfix) list ->
(binding * string option * mixfix) list -> string list -> string -> Proof.context ->
((binding * typ option * mixfix) list * string list * (string -> Position.T list) * term) *
Proof.context
type multi_specs =
((Attrib.binding * term) * term list * (binding * typ option * mixfix) list) list
type multi_specs_cmd =
((Attrib.binding * string) * string list * (binding * string option * mixfix) list) list
val check_multi_specs: (binding * typ option * mixfix) list -> multi_specs -> Proof.context ->
(((binding * typ) * mixfix) list * (Attrib.binding * term) list) * Proof.context
val read_multi_specs: (binding * string option * mixfix) list -> multi_specs_cmd -> Proof.context ->
(((binding * typ) * mixfix) list * (Attrib.binding * term) list) * Proof.context
val axiomatization: (binding * typ option * mixfix) list ->
(binding * typ option * mixfix) list -> term list ->
(Attrib.binding * term) list -> theory -> (term list * thm list) * theory
val axiomatization_cmd: (binding * string option * mixfix) list ->
(binding * string option * mixfix) list -> string list ->
(Attrib.binding * string) list -> theory -> (term list * thm list) * theory
val axiom: Attrib.binding * term -> theory -> thm * theory
val definition: (binding * typ option * mixfix) option ->
(binding * typ option * mixfix) list -> term list -> Attrib.binding * term ->
local_theory -> (term * (string * thm)) * local_theory
val definition_cmd: (binding * string option * mixfix) option ->
(binding * string option * mixfix) list -> string list -> Attrib.binding * string ->
bool -> local_theory -> (term * (string * thm)) * local_theory
val abbreviation: Syntax.mode -> (binding * typ option * mixfix) option ->
(binding * typ option * mixfix) list -> term -> bool -> local_theory -> local_theory
val abbreviation_cmd: Syntax.mode -> (binding * string option * mixfix) option ->
(binding * string option * mixfix) list -> string -> bool -> local_theory -> local_theory
val alias: binding * string -> local_theory -> local_theory
val alias_cmd: binding * (xstring * Position.T) -> local_theory -> local_theory
val type_alias: binding * string -> local_theory -> local_theory
val type_alias_cmd: binding * (xstring * Position.T) -> local_theory -> local_theory
val theorems: string ->
(Attrib.binding * Attrib.thms) list ->
(binding * typ option * mixfix) list ->
bool -> local_theory -> (string * thm list) list * local_theory
val theorems_cmd: string ->
(Attrib.binding * (Facts.ref * Token.src list) list) list ->
(binding * string option * mixfix) list ->
bool -> local_theory -> (string * thm list) list * local_theory
val theorem: bool -> string -> Method.text option ->
(thm list list -> local_theory -> local_theory) -> Attrib.binding ->
string list -> Element.context_i list -> Element.statement_i ->
bool -> local_theory -> Proof.state
val theorem_cmd: bool -> string -> Method.text option ->
(thm list list -> local_theory -> local_theory) -> Attrib.binding ->
(xstring * Position.T) list -> Element.context list -> Element.statement ->
bool -> local_theory -> Proof.state
val schematic_theorem: bool -> string -> Method.text option ->
(thm list list -> local_theory -> local_theory) -> Attrib.binding ->
string list -> Element.context_i list -> Element.statement_i ->
bool -> local_theory -> Proof.state
val schematic_theorem_cmd: bool -> string -> Method.text option ->
(thm list list -> local_theory -> local_theory) -> Attrib.binding ->
(xstring * Position.T) list -> Element.context list -> Element.statement ->
bool -> local_theory -> Proof.state
end;
structure Specification: SPECIFICATION =
struct
(* prepare propositions *)
fun read_props raw_props raw_fixes ctxt =
let
val (_, ctxt1) = ctxt |> Proof_Context.add_fixes_cmd raw_fixes;
val props1 = map (Syntax.parse_prop ctxt1) raw_props;
val (props2, ctxt2) = ctxt1 |> fold_map Variable.fix_dummy_patterns props1;
val props3 = Syntax.check_props ctxt2 props2;
val ctxt3 = ctxt2 |> fold Variable.declare_term props3;
in (props3, ctxt3) end;
(* prepare specification *)
fun get_positions ctxt x =
let
fun get Cs (Const ("_type_constraint_", C) $ t) = get (C :: Cs) t
| get Cs (Free (y, T)) =
if x = y then
map_filter Term_Position.decode_positionT
(T :: map (Type.constraint_type ctxt) Cs)
else []
| get _ (t $ u) = get [] t @ get [] u
| get _ (Abs (_, _, t)) = get [] t
| get _ _ = [];
in get [] end;
local
fun prep_decls prep_var raw_vars ctxt =
let
val (vars, ctxt') = fold_map prep_var raw_vars ctxt;
val (xs, ctxt'') = ctxt'
|> Context_Position.set_visible false
|> Proof_Context.add_fixes vars
||> Context_Position.restore_visible ctxt';
val _ =
Context_Position.reports ctxt''
(map (Binding.pos_of o #1) vars ~~
map (Variable.markup_entity_def ctxt'' ##> Properties.remove Markup.kindN) xs);
in ((vars, xs), ctxt'') end;
fun close_form ctxt ys prems concl =
let
val xs = rev (fold (Variable.add_free_names ctxt) (prems @ [concl]) (rev ys));
val pos_props = Logic.strip_imp_concl concl :: Logic.strip_imp_prems concl @ prems;
fun get_pos x = maps (get_positions ctxt x) pos_props;
val _ = Context_Position.reports ctxt (maps (Syntax_Phases.reports_of_scope o get_pos) xs);
in Logic.close_prop_constraint (Variable.default_type ctxt) (xs ~~ xs) prems concl end;
fun dummy_frees ctxt xs tss =
let
val names =
Variable.names_of ((fold o fold) Variable.declare_term tss ctxt)
|> fold Name.declare xs;
val (tss', _) = (fold_map o fold_map) Term.free_dummy_patterns tss names;
in tss' end;
fun prep_spec_open prep_var parse_prop raw_vars raw_params raw_prems raw_concl ctxt =
let
val ((vars, xs), vars_ctxt) = prep_decls prep_var raw_vars ctxt;
val (ys, params_ctxt) = vars_ctxt |> fold_map prep_var raw_params |-> Proof_Context.add_fixes;
val props =
map (parse_prop params_ctxt) (raw_concl :: raw_prems)
|> singleton (dummy_frees params_ctxt (xs @ ys));
val concl :: prems = Syntax.check_props params_ctxt props;
val spec = Logic.list_implies (prems, concl);
val spec_ctxt = Variable.declare_term spec params_ctxt;
fun get_pos x = maps (get_positions spec_ctxt x) props;
in ((vars, xs, get_pos, spec), spec_ctxt) end;
fun prep_specs prep_var parse_prop prep_att raw_vars raw_specss ctxt =
let
val ((vars, xs), vars_ctxt) = prep_decls prep_var raw_vars ctxt;
val propss0 =
raw_specss |> map (fn ((_, raw_concl), raw_prems, raw_params) =>
let val (ys, ctxt') = vars_ctxt |> fold_map prep_var raw_params |-> Proof_Context.add_fixes
in (ys, map (pair ctxt') (raw_concl :: raw_prems)) end);
val props =
burrow (grouped 10 Par_List.map_independent (uncurry parse_prop)) (map #2 propss0)
|> dummy_frees vars_ctxt xs
|> map2 (fn (ys, _) => fn concl :: prems => close_form vars_ctxt ys prems concl) propss0;
val specs = Syntax.check_props vars_ctxt props;
val specs' = specs |> map (DOF_core.elaborate_term ctxt)
val specs_ctxt = vars_ctxt |> fold Variable.declare_term specs;
val ps = specs_ctxt |> fold_map Proof_Context.inferred_param xs |> fst;
val params = map2 (fn (b, _, mx) => fn (_, T) => ((b, T), mx)) vars ps;
val name_atts: Attrib.binding list =
map (fn ((name, atts), _) => (name, map (prep_att ctxt) atts)) (map #1 raw_specss);
in ((params, name_atts ~~ specs'), specs_ctxt) end;
in
val check_spec_open = prep_spec_open Proof_Context.cert_var (K I);
val read_spec_open = prep_spec_open Proof_Context.read_var Syntax.parse_prop;
type multi_specs =
((Attrib.binding * term) * term list * (binding * typ option * mixfix) list) list;
type multi_specs_cmd =
((Attrib.binding * string) * string list * (binding * string option * mixfix) list) list;
fun check_multi_specs xs specs =
prep_specs Proof_Context.cert_var (K I) (K I) xs specs;
fun read_multi_specs xs specs =
prep_specs Proof_Context.read_var Syntax.parse_prop Attrib.check_src xs specs;
end;
(* axiomatization -- within global theory *)
fun gen_axioms prep_stmt prep_att raw_decls raw_fixes raw_prems raw_concls thy =
let
(*specification*)
val ({vars, propss = [prems, concls], ...}, vars_ctxt) =
Proof_Context.init_global thy
|> prep_stmt (raw_decls @ raw_fixes) ((map o map) (rpair []) [raw_prems, map snd raw_concls]);
val (decls, fixes) = chop (length raw_decls) vars;
val frees =
rev ((fold o fold) (Variable.add_frees vars_ctxt) [prems, concls] [])
|> map (fn (x, T) => (x, Free (x, T)));
val close = Logic.close_prop (map #2 fixes @ frees) prems;
val specs =
map ((apsnd o map) (prep_att vars_ctxt) o fst) raw_concls ~~ map close concls;
val spec_name =
Binding.conglomerate (if null decls then map (#1 o #1) specs else map (#1 o #1) decls);
(*consts*)
val (consts, consts_thy) = thy
|> fold_map (fn ((b, _, mx), (_, t)) => Theory.specify_const ((b, Term.type_of t), mx)) decls;
val subst = Term.subst_atomic (map (#2 o #2) decls ~~ consts);
(*axioms*)
val (axioms, axioms_thy) =
(specs, consts_thy) |-> fold_map (fn ((b, atts), prop) =>
Thm.add_axiom_global (b, subst prop) #>> (fn (_, th) => ((b, atts), [([th], [])])));
(*facts*)
val (facts, facts_lthy) = axioms_thy
|> Named_Target.theory_init
|> Spec_Rules.add spec_name Spec_Rules.Unknown consts (maps (maps #1 o #2) axioms)
|> Local_Theory.notes axioms;
in ((consts, map (the_single o #2) facts), Local_Theory.exit_global facts_lthy) end;
val axiomatization = gen_axioms Proof_Context.cert_stmt (K I);
val axiomatization_cmd = gen_axioms Proof_Context.read_stmt Attrib.check_src;
fun axiom (b, ax) = axiomatization [] [] [] [(b, ax)] #>> (hd o snd);
(* definition *)
fun gen_def prep_spec prep_att raw_var raw_params raw_prems ((a, raw_atts), raw_spec) int lthy =
let
val atts = map (prep_att lthy) raw_atts;
val ((vars, xs, get_pos, spec), _) = lthy
|> prep_spec (the_list raw_var) raw_params raw_prems raw_spec;
val (((x, T), rhs), prove) = Local_Defs.derived_def lthy get_pos {conditional = true} spec;
val _ = Name.reject_internal (x, []);
val (b, mx) =
(case (vars, xs) of
([], []) => (Binding.make (x, (case get_pos x of [] => Position.none | p :: _ => p)), NoSyn)
| ([(b, _, mx)], [y]) =>
if x = y then (b, mx)
else
error ("Head of definition " ^ quote x ^ " differs from declaration " ^ quote y ^
Position.here (Binding.pos_of b)));
val name = Thm.def_binding_optional b a;
val ((lhs, (_, raw_th)), lthy2) = lthy
|> Local_Theory.define_internal ((b, mx), ((Binding.suffix_name "_raw" name, []), rhs));
val ([(def_name, [th])], lthy3) = lthy2
|> Local_Theory.notes [((name, atts), [([prove lthy2 raw_th], [])])];
val lthy4 = lthy3
|> Spec_Rules.add name Spec_Rules.equational [lhs] [th]
|> Code.declare_default_eqns [(th, true)];
val lhs' = Morphism.term (Local_Theory.target_morphism lthy4) lhs;
val _ =
Proof_Display.print_consts int (Position.thread_data ()) lthy4
(Frees.defined (Frees.build (Frees.add_frees lhs'))) [(x, T)];
in ((lhs, (def_name, th)), lthy4) end;
fun definition xs ys As B = gen_def check_spec_open (K I) xs ys As B false;
val definition_cmd = gen_def read_spec_open Attrib.check_src;
(* abbreviation *)
fun gen_abbrev prep_spec mode raw_var raw_params raw_spec int lthy =
let
val lthy1 = lthy |> Proof_Context.set_syntax_mode mode;
val ((vars, xs, get_pos, spec), _) = lthy
|> Proof_Context.set_mode Proof_Context.mode_abbrev
|> prep_spec (the_list raw_var) raw_params [] raw_spec;
val ((x, T), rhs) = Local_Defs.abs_def (#2 (Local_Defs.cert_def lthy1 get_pos spec));
val _ = Name.reject_internal (x, []);
val (b, mx) =
(case (vars, xs) of
([], []) => (Binding.make (x, (case get_pos x of [] => Position.none | p :: _ => p)), NoSyn)
| ([(b, _, mx)], [y]) =>
if x = y then (b, mx)
else
error ("Head of abbreviation " ^ quote x ^ " differs from declaration " ^ quote y ^
Position.here (Binding.pos_of b)));
val lthy2 = lthy1
|> Local_Theory.abbrev mode ((b, mx), rhs) |> snd
|> Proof_Context.restore_syntax_mode lthy;
val _ = Proof_Display.print_consts int (Position.thread_data ()) lthy2 (K false) [(x, T)];
in lthy2 end;
val abbreviation = gen_abbrev check_spec_open;
val abbreviation_cmd = gen_abbrev read_spec_open;
(* alias *)
fun gen_alias decl check (b, arg) lthy =
let
val (c, reports) = check {proper = true, strict = false} lthy arg;
val _ = Context_Position.reports lthy reports;
in decl b c lthy end;
val alias =
gen_alias Local_Theory.const_alias (K (K (fn c => (c, []))));
val alias_cmd =
gen_alias Local_Theory.const_alias
(fn flags => fn ctxt => fn (c, pos) =>
apfst (#1 o dest_Const) (Proof_Context.check_const flags ctxt (c, [pos])));
val type_alias =
gen_alias Local_Theory.type_alias (K (K (fn c => (c, []))));
val type_alias_cmd =
gen_alias Local_Theory.type_alias (apfst (#1 o dest_Type) ooo Proof_Context.check_type_name);
(* fact statements *)
local
fun gen_theorems prep_fact prep_att add_fixes
kind raw_facts raw_fixes int lthy =
let
val facts = raw_facts |> map (fn ((name, atts), bs) =>
((name, map (prep_att lthy) atts),
bs |> map (fn (b, more_atts) => (prep_fact lthy b, map (prep_att lthy) more_atts))));
val (_, ctxt') = add_fixes raw_fixes lthy;
val facts' = facts
|> Attrib.partial_evaluation ctxt'
|> Attrib.transform_facts (Proof_Context.export_morphism ctxt' lthy);
val (res, lthy') = lthy |> Local_Theory.notes_kind kind facts';
val _ =
Proof_Display.print_results
{interactive = int, pos = Position.thread_data (), proof_state = false}
lthy' ((kind, ""), res);
in (res, lthy') end;
in
val theorems = gen_theorems (K I) (K I) Proof_Context.add_fixes;
val theorems_cmd = gen_theorems Proof_Context.get_fact Attrib.check_src Proof_Context.add_fixes_cmd;
end;
(* complex goal statements *)
local
fun prep_statement prep_att prep_stmt raw_elems raw_stmt ctxt =
let
val (stmt, elems_ctxt) = prep_stmt raw_elems raw_stmt ctxt;
val prems = Assumption.local_prems_of elems_ctxt ctxt;
val stmt_ctxt = fold (fold (Proof_Context.augment o fst) o snd) stmt elems_ctxt;
in
(case raw_stmt of
Element.Shows _ =>
let val stmt' = Attrib.map_specs (map prep_att) stmt
in (([], prems, stmt', NONE), stmt_ctxt) end
| Element.Obtains raw_obtains =>
let
val asms_ctxt = stmt_ctxt
|> fold (fn ((name, _), asm) =>
snd o Proof_Context.add_assms Assumption.assume_export
[((name, [Context_Rules.intro_query NONE]), asm)]) stmt;
val that = Assumption.local_prems_of asms_ctxt stmt_ctxt;
val ([(_, that')], that_ctxt) = asms_ctxt
|> Proof_Context.set_stmt true
|> Proof_Context.note_thmss "" [((Binding.name Auto_Bind.thatN, []), [(that, [])])]
||> Proof_Context.restore_stmt asms_ctxt;
val stmt' = [(Binding.empty_atts, [(#2 (#1 (Obtain.obtain_thesis ctxt)), [])])];
in ((Obtain.obtains_attribs raw_obtains, prems, stmt', SOME that'), that_ctxt) end)
end;
fun gen_theorem schematic bundle_includes prep_att prep_stmt
long kind before_qed after_qed (name, raw_atts) raw_includes raw_elems raw_concl int lthy =
let
val _ = Local_Theory.assert lthy;
val elems = raw_elems |> map (Element.map_ctxt_attrib (prep_att lthy));
val ((more_atts, prems, stmt, facts), goal_ctxt) = lthy
|> bundle_includes raw_includes
|> prep_statement (prep_att lthy) prep_stmt elems raw_concl;
val atts = more_atts @ map (prep_att lthy) raw_atts;
val pos = Position.thread_data ();
val print_results =
Proof_Display.print_results {interactive = int, pos = pos, proof_state = false};
fun after_qed' results goal_ctxt' =
let
val results' =
burrow (map (Goal.norm_result lthy) o Proof_Context.export goal_ctxt' lthy) results;
val (res, lthy') =
if forall (Binding.is_empty_atts o fst) stmt then (map (pair "") results', lthy)
else
Local_Theory.notes_kind kind
(map2 (fn (b, _) => fn ths => (b, [(ths, [])])) stmt results') lthy;
val lthy'' =
if Binding.is_empty_atts (name, atts)
then (print_results lthy' ((kind, ""), res); lthy')
else
let
val ([(res_name, _)], lthy'') =
Local_Theory.notes_kind kind [((name, atts), [(maps #2 res, [])])] lthy';
val _ = print_results lthy' ((kind, res_name), res);
in lthy'' end;
in after_qed results' lthy'' end;
val prems_name = if long then Auto_Bind.assmsN else Auto_Bind.thatN;
in
goal_ctxt
|> not (null prems) ?
(Proof_Context.note_thmss "" [((Binding.name prems_name, []), [(prems, [])])] #> snd)
|> Proof.theorem before_qed after_qed' (map snd stmt)
|> (case facts of NONE => I | SOME ths => Proof.refine_insert ths)
|> tap (fn state => not schematic andalso Proof.schematic_goal state andalso
error "Illegal schematic goal statement")
end;
in
val theorem =
gen_theorem false Bundle.includes (K I) Expression.cert_statement;
val theorem_cmd =
gen_theorem false Bundle.includes_cmd Attrib.check_src Expression.read_statement;
val schematic_theorem =
gen_theorem true Bundle.includes (K I) Expression.cert_statement;
val schematic_theorem_cmd =
gen_theorem true Bundle.includes_cmd Attrib.check_src Expression.read_statement;
end;
end;

@@ -0,0 +1,237 @@
(*************************************************************************
* Copyright (C)
* 2019 The University of Exeter
* 2018-2019 The University of Paris-Saclay
* 2018 The University of Sheffield
*
* License:
* This program can be redistributed and/or modified under the terms
* of the 2-clause BSD-style license.
*
* SPDX-License-Identifier: BSD-2-Clause
*************************************************************************)
chapter\<open>Testing Freeform and Formal Elements from the scholarly-paper Ontology\<close>
theory
AssnsLemmaThmEtc
imports
"Isabelle_DOF-Ontologies.Conceptual"
"Isabelle_DOF.scholarly_paper"
"Isabelle_DOF_Unit_Tests_document"
TestKit
begin
section\<open>Test Objective\<close>
text\<open>Testing Core Elements for \<^theory>\<open>Isabelle_DOF.scholarly_paper\<close> w.r.t.
existence, controllability via implicit and explicit default classes, and potential
LaTeX layout.\<close>
text\<open>Current status:\<close>
print_doc_classes
print_doc_items
section\<open>An Example for use-before-declaration of Formal Content\<close>
text*[aa::F, properties = "[@{term ''True''}]"]
\<open>Our definition of the HOL-Logic has the following properties:\<close>
assert*\<open>F.properties @{F \<open>aa\<close>} = [@{term ''True''}]\<close>
text\<open>For now, as the term annotation is not bound to a meta-logic that would translate
\<^term>\<open>[@{term ''True''}]\<close> to \<^term>\<open>[True]\<close>, we cannot use the HOL \<^const>\<open>True\<close> constant
in the assertion.\<close>
ML\<open> @{term_ "[@{term \<open>True \<longrightarrow> True \<close>}]"}; (* with isa-check *) \<close>
ML\<open>
(* Checking the default classes which should be in a neutral(unset) state. *)
(* Note that in this state, the "implicit default" is "math_content". *)
@{assert} (Config.get_global @{theory} Definition_default_class = "");
@{assert} (Config.get_global @{theory} Lemma_default_class = "");
@{assert} (Config.get_global @{theory} Theorem_default_class = "");
@{assert} (Config.get_global @{theory} Proposition_default_class = "");
@{assert} (Config.get_global @{theory} Premise_default_class = "");
@{assert} (Config.get_global @{theory} Corollary_default_class = "");
@{assert} (Config.get_global @{theory} Consequence_default_class = "");
@{assert} (Config.get_global @{theory} Assumption_default_class = "");
@{assert} (Config.get_global @{theory} Hypothesis_default_class = "");
@{assert} (Config.get_global @{theory} Consequence_default_class = "");
@{assert} (Config.get_global @{theory} Assertion_default_class = "");
@{assert} (Config.get_global @{theory} Proof_default_class = "");
@{assert} (Config.get_global @{theory} Example_default_class = "");
\<close>
Definition*[e1]\<open>Lorem ipsum dolor sit amet, ... \<close>
text\<open>Note that this should yield a warning since \<^theory_text>\<open>Definition*\<close> uses as "implicit default" the class
\<^doc_class>\<open>math_content\<close>, which has no \<^term>\<open>text_element.level\<close> set; however, in this context
it is required to be a positive number since it is \<^term>\<open>text_element.referentiable\<close>.
This is intended behaviour, in order to give the user a nudge to be more specific.\<close>
text\<open>A repair looks like this:\<close>
declare [[Definition_default_class = "definition"]]
text\<open>Now, define a forward reference to the formal content: \<close>
declare_reference*[e1bisbis::"definition"]
text\<open>... which makes it possible to refer, in a freeform definition, to its formal counterpart
that appears textually later. With these pragmatics, an "out-of-order presentation"
can be achieved within \<^theory>\<open>Isabelle_DOF.scholarly_paper\<close> for the most common cases.\<close>
(*<*) (* PDF references to definition* not implemented *)
Definition*[e1bis::"definition", short_name="\<open>Nice lemma.\<close>"]
\<open>Lorem ipsum dolor sit amet, ...
This is formally defined as follows in @{definition (unchecked) "e1bisbis"}\<close>
definition*[e1bisbis, status=formal] e :: int where "e = 2"
(*>*)
section\<open>Tests for Theorems, Assertions, Assumptions, Hypothesis, etc.\<close>
declare [[Theorem_default_class = "theorem",
Premise_default_class = "premise",
Hypothesis_default_class = "hypothesis",
Assumption_default_class = "assumption",
Conclusion_default_class = "conclusion",
Consequence_default_class = "consequence",
Assertion_default_class = "assertion",
Corollary_default_class = "corollary",
Proof_default_class = "math_proof",
Conclusion_default_class = "conclusion_stmt"]]
Theorem*[e2]\<open>... suspendisse non arcu malesuada mollis, nibh morbi, ... \<close>
theorem*[e2bis::"theorem", status=formal] f : "e = 1+1" unfolding e_def by simp
(*<*) (* @{theorem "e2bis"} breaks LaTeX generation ... *)
Lemma*[e3,level="Some 2"]
\<open>... phasellus amet id massa nunc, pede suscipit repellendus, ... @{theorem "e2bis"} \<close>
(*>*)
Proof*[d10, short_name="\<open>Induction over Tinea pedis.\<close>"]\<open>Freeform Proof\<close>
lemma*[dfgd::"lemma"] q: "All (\<lambda>x. X \<and> Y \<longrightarrow> True)" oops
text-assert-error\<open>@{lemma dfgd} \<close>\<open>Undefined instance:\<close> \<comment> \<open>oopsed objects are not referentiable.\<close>
text\<open>... in ut tortor eleifend augue pretium consectetuer...
Lectus accumsan velit ultrices, ...\<close>
Proposition*[d2::"proposition"]\<open>"Freeform Proposition"\<close>
Assumption*[d3] \<open>"Freeform Assertion"\<close>
Premise*[d4]\<open>"Freeform Premise"\<close>
Corollary*[d5]\<open>"Freeform Corollary"\<close>
Consequence*[d6::scholarly_paper.consequence]\<open>"Freeform Consequence"\<close> \<comment> \<open>longname just for test\<close>
(*<*)
declare_reference*[ababa::scholarly_paper.assertion]
Assertion*[d7]\<open>Freeform Assumption with forward reference to the formal
@{assertion (unchecked) ababa}.\<close>
assert*[ababa::assertion] "3 < (4::int)"
assert*[ababab::assertion] "0 < (4::int)"
(*>*)
Conclusion*[d8]\<open>"Freeform Conclusion"\<close>
Hypothesis*[d9]\<open>"Freeform Hypothesis"\<close>
Example*[d11::math_example]\<open>"Freeform Example"\<close>
text\<open>An example of the ontology-specification character of short-cuts such as
@{command "assert*"}: in the following, we use the same notation to refer to a completely
different class. "F" and "assertion" have in common only that they possess the attribute
@{const [names_short] \<open>properties\<close>}: \<close>
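A minimal sketch of this dual use, combining the \<open>aa::F\<close> and \<open>ababa::assertion\<close> patterns already shown earlier in this theory (the instance names \<open>aa'\<close> and \<open>ab'\<close> are hypothetical, chosen here only for illustration):

```isabelle
(* Hypothetical sketch: the same assert* notation serves two unrelated
   classes that merely share a "properties" attribute. *)
text*[aa'::F, properties = "[@{term ''True''}]"]
  \<open>An instance of class F from the Conceptual ontology ...\<close>
assert*\<open>F.properties @{F \<open>aa'\<close>} = [@{term ''True''}]\<close>

assert*[ab'::assertion] "3 < (4::int)"  (* same command, class assertion *)
```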
section\<open>Exhaustive Scholarly\_paper Test\<close>
subsection\<open>Global Structural Elements\<close>
(* Maybe it is neither necessary nor possible to test these here: title is unique in
a document, for example. To be commented out if needed. *)
text*[tt1::scholarly_paper.title]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt2::scholarly_paper.author]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt3::scholarly_paper.article]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt4::scholarly_paper.annex]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt5::scholarly_paper.abstract]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt6::scholarly_paper.subtitle]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt7::scholarly_paper.bibliography]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt8::scholarly_paper.introduction]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt9::scholarly_paper.related_work]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt11::scholarly_paper.text_section]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt12::scholarly_paper.background ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt13::scholarly_paper.conclusion ]\<open>Lectus accumsan velit ultrices, ...\<close>
subsection\<open>Technical Content Specific Elements\<close>
text*[tu1::scholarly_paper.axiom ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu1bis::scholarly_paper.math_content, mcc="axm" ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu2::scholarly_paper.lemma ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu3::scholarly_paper.example ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu4::scholarly_paper.premise ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu5::scholarly_paper.theorem ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu6::scholarly_paper.assertion]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu7::scholarly_paper.corollary]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu9::scholarly_paper.technical]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu10::scholarly_paper.assumption ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu13::scholarly_paper.definition ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu15::scholarly_paper.experiment ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu16::scholarly_paper.hypothesis ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu17::scholarly_paper.math_proof ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu18::scholarly_paper.consequence]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu19::scholarly_paper.math_formal]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu20::scholarly_paper.proposition]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu21::scholarly_paper.math_content ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu22::scholarly_paper.math_example ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu23::scholarly_paper.conclusion_stmt ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu24::scholarly_paper.math_motivation ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu25::scholarly_paper.tech_definition ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu28::scholarly_paper.eng_example ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt10::scholarly_paper.tech_example]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu8::scholarly_paper.tech_code] \<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu27::scholarly_paper.engineering_content]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu14::scholarly_paper.evaluation ]\<open>Lectus accumsan velit ultrices, ...\<close>
text\<open> @{axiom tu1} @{lemma tu2} @{example tu3} @{premise tu4} @{theorem tu5} @{assertion tu6}
@{technical tu9} @{assumption tu10 } @{definition tu13 }
@{experiment tu15 } @{hypothesis tu16 } @{math_proof tu17 }
@{consequence tu18 } @{math_formal tu19 } @{proposition tu20 }
@{math_content tu21 } @{math_example tu22 } @{conclusion_stmt tu23 }
@{math_motivation tu24 } @{tech_definition tu25 } @{eng_example tu28 }
@{tech_example tt10 } @{tech_code tu8 } @{engineering_content tu27 }
@{evaluation tu14 }
\<close>
subsection\<open>The Use in Macros\<close>
Lemma*[ttu2::scholarly_paper.lemma ]\<open>Lectus accumsan velit ultrices, ...\<close>
Example*[ttu3::scholarly_paper.math_example ]\<open>Lectus accumsan velit ultrices, ...\<close>
Premise*[ttu4::scholarly_paper.premise ]\<open>Lectus accumsan velit ultrices, ...\<close>
Theorem*[ttu5::scholarly_paper.theorem ]\<open>Lectus accumsan velit ultrices, ...\<close>
Assertion*[ttu6::scholarly_paper.assertion]\<open>Lectus accumsan velit ultrices, ...\<close>
Corollary*[ttu7::scholarly_paper.corollary]\<open>Lectus accumsan velit ultrices, ...\<close>
Assumption*[ttu10::scholarly_paper.assumption ]\<open>Lectus accumsan velit ultrices, ...\<close>
Definition*[ttu13::scholarly_paper.definition ]\<open>Lectus accumsan velit ultrices, ...\<close>
Hypothesis*[ttu16::scholarly_paper.hypothesis ]\<open>Lectus accumsan velit ultrices, ...\<close>
Proof*[ttu17::scholarly_paper.math_proof ]\<open>Lectus accumsan velit ultrices, ...\<close>
Consequence*[ttu18::scholarly_paper.consequence]\<open>Lectus accumsan velit ultrices, ...\<close>
Proposition*[ttu20::scholarly_paper.proposition]\<open>Lectus accumsan velit ultrices, ...\<close>
Conclusion*[ttu23::scholarly_paper.conclusion_stmt ]\<open>Lectus accumsan velit ultrices, ...\<close>
(* Definition*[ttu25::scholarly_paper.tech_definition ]\<open>Lectus accumsan velit ultrices, ...\<close>
interesting modeling bug.
*)
(*Example*[ttu28::scholarly_paper.eng_example ]\<open>Lectus accumsan velit ultrices, ...\<close>
interesting modeling bug.
*)
text\<open> @{lemma ttu2} @{math_example ttu3} @{premise ttu4} @{theorem ttu5} @{assertion ttu6}
@{assumption ttu10 } @{definition ttu13 }
@{hypothesis ttu16 } @{math_proof ttu17 }
@{consequence ttu18 } @{proposition ttu20 }
@{math_content tu21 } @{conclusion_stmt ttu23 }
@ \<open>{eng_example ttu28 }\<close>
@ \<open>{tech_example tt10 }\<close>
\<close>
end

Some files were not shown because too many files have changed in this diff.