Compare commits: main ... dedukti-pr (475 commits)
Author | SHA1 | Date |
---|---|---|
Nicolas Méric | fda02be889 | |
Nicolas Méric | 989ab3c315 | |
Nicolas Méric | 7b54bf5ca5 | |
Nicolas Méric | baa36b10c1 | |
Nicolas Méric | c57ce6292b | |
Achim D. Brucker | b698572146 | |
Achim D. Brucker | e12abadc94 | |
Achim D. Brucker | 792fd60055 | |
Nicolas Méric | ec7297f1d3 | |
Achim D. Brucker | e4ee3ff240 | |
Achim D. Brucker | 4393042f2c | |
Achim D. Brucker | fef7b9d60b | |
Achim D. Brucker | ab7d695a77 | |
Achim D. Brucker | c063287947 | |
Achim D. Brucker | 342984df3b | |
Achim D. Brucker | 5a8e79fb7e | |
Achim D. Brucker | d7f9f10ef1 | |
Achim D. Brucker | 0a3259fbca | |
Nicolas Méric | ca7cdec9b4 | |
Nicolas Méric | 43aad517b9 | |
Nicolas Méric | 8d6c8929e2 | |
Nicolas Méric | b447a480fb | |
Nicolas Méric | a78397693e | |
Nicolas Méric | 9812bc0517 | |
Nicolas Méric | b364880bfc | |
Nicolas Méric | 5a7cbf2da5 | |
Nicolas Méric | 7f7780f8fd | |
Nicolas Méric | 889805cccc | |
Nicolas Méric | 5a07aa2453 | |
Nicolas Méric | cef4086029 | |
Nicolas Méric | 9df276ac6f | |
Nicolas Méric | b4f1b8c321 | |
Nicolas Méric | 59b082d09d | |
Achim D. Brucker | 1869a96b2d | |
Achim D. Brucker | e95c6386af | |
Achim D. Brucker | 23a85cc8c2 | |
Achim D. Brucker | ddcfb5f708 | |
Achim D. Brucker | 02d13cdcad | |
Achim D. Brucker | d353ff07cc | |
Achim D. Brucker | 38035785da | |
Achim D. Brucker | 7e7c197ac3 | |
Nicolas Méric | 4f8e588138 | |
Nicolas Méric | 2c0b51779e | |
Nicolas Méric | 350ff6fe76 | |
Achim D. Brucker | c803474950 | |
Achim D. Brucker | e17f09e624 | |
Achim D. Brucker | 8051d4233e | |
Nicolas Méric | b4b63ce989 | |
Achim D. Brucker | 2dc16b263f | |
Achim D. Brucker | 5754bb4adc | |
Achim D. Brucker | c7debc577b | |
Achim D. Brucker | 9c94593f45 | |
Nicolas Méric | 4d89250606 | |
Achim D. Brucker | 3f06320034 | |
Achim D. Brucker | 49faed4faf | |
Achim D. Brucker | 1a22441f3e | |
Achim D. Brucker | df1b2c9904 | |
Achim D. Brucker | 9064cd3f62 | |
Nicolas Méric | f5b8d4348b | |
Achim D. Brucker | d225a3253c | |
Achim D. Brucker | 2ee0bc5074 | |
Achim D. Brucker | 9683ea7efa | |
Burkhart Wolff | bce097b1d6 | |
Nicolas Méric | 65d6fb946d | |
Achim D. Brucker | 060f2aca89 | |
Nicolas Méric | ba7c0711a8 | |
Achim D. Brucker | 4adbe4ce81 | |
Achim D. Brucker | 7e698a9e69 | |
Achim D. Brucker | 2569db05c3 | |
Nicolas Méric | cd311d8a3a | |
Achim D. Brucker | fb69f05ac0 | |
Achim D. Brucker | 1986d0bcbd | |
Achim D. Brucker | bbac65e233 | |
Achim D. Brucker | 9cd34d7815 | |
Achim D. Brucker | 641bea4a58 | |
Burkhart Wolff | d0cd28a45c | |
Burkhart Wolff | db4290428f | |
Burkhart Wolff | 43da6d3197 | |
Achim D. Brucker | a93046beac | |
Nicolas Méric | b8282b771e | |
Burkhart Wolff | 1cfc4ac88a | |
Burkhart Wolff | e9044e8d5a | |
Achim D. Brucker | 6bab138af6 | |
Achim D. Brucker | fcc25f7450 | |
Burkhart Wolff | e97cca1a2c | |
Burkhart Wolff | 33fd1453a0 | |
Burkhart Wolff | 543c647bcc | |
Burkhart Wolff | f7141f0df8 | |
Burkhart Wolff | 514ebee17c | |
Burkhart Wolff | bdc8477f38 | |
Nicolas Méric | 7e01b7de97 | |
Burkhart Wolff | 8bdd40fc20 | |
Idir Ait-Sadoune | 9cc03c0816 | |
Idir Ait-Sadoune | e9cfcdbcbc | |
Burkhart Wolff | 36740bf72b | |
Burkhart Wolff | b8da1a304a | |
Burkhart Wolff | 5b519fcbe6 | |
Burkhart Wolff | 50da7670cf | |
Achim D. Brucker | 09d1b27f10 | |
Achim D. Brucker | 34e23b314f | |
Burkhart Wolff | 0aa9f1ff25 | |
Achim D. Brucker | 3f8fc4f16f | |
Achim D. Brucker | b62b391410 | |
Achim D. Brucker | 41a4f38478 | |
Burkhart Wolff | ca8671ee1c | |
Burkhart Wolff | 9e210b487a | |
Burkhart Wolff | 6317294721 | |
Burkhart Wolff | 762680a20c | |
Burkhart Wolff | 850244844b | |
Burkhart Wolff | 322d70ef69 | |
Burkhart Wolff | b04ff7e31a | |
Burkhart Wolff | 7ba220e417 | |
Burkhart Wolff | 713a24615f | |
Burkhart Wolff | 7ffdcbc569 | |
Achim D. Brucker | 43ce393e4a | |
Burkhart Wolff | 4326492b39 | |
Burkhart Wolff | 1e7f6a7c18 | |
Achim D. Brucker | a087e94ebe | |
Achim D. Brucker | 78cb606268 | |
Achim D. Brucker | c40a5a74c1 | |
Achim D. Brucker | fc214fc391 | |
Burkhart Wolff | f613811154 | |
Burkhart Wolff | 4c66716999 | |
Achim D. Brucker | 639abb6cf5 | |
Achim D. Brucker | 2c00f4b8db | |
Burkhart Wolff | d9e2f251d2 | |
Burkhart Wolff | cec21c9935 | |
Achim D. Brucker | 640a867f28 | |
Achim D. Brucker | 0c654e2634 | |
Achim D. Brucker | 01bcc48c79 | |
Achim D. Brucker | c3aaaf9ebb | |
Achim D. Brucker | 47e8fc805f | |
Achim D. Brucker | 02bf9620f6 | |
Nicolas Méric | 18be1ba5f5 | |
Nicolas Méric | 93c722a41b | |
Nicolas Méric | 0f48f356df | |
Achim D. Brucker | 870a4eec57 | |
Achim D. Brucker | 4df233e9f4 | |
Burkhart Wolff | 5d7b50ca7f | |
Burkhart Wolff | 1ebfaccb50 | |
Burkhart Wolff | 7ce3fdf768 | |
Burkhart Wolff | db130bd6ce | |
Achim D. Brucker | 496a850700 | |
Achim D. Brucker | 101f96a261 | |
Achim D. Brucker | 49aa29ee68 | |
Burkhart Wolff | 2919f5d2a5 | |
Burkhart Wolff | 6cafcce536 | |
Burkhart Wolff | ebce149d6a | |
Burkhart Wolff | 6984b9ae03 | |
Burkhart Wolff | 74e2341971 | |
Burkhart Wolff | 16caefc7be | |
Achim D. Brucker | 0d74645d2e | |
Burkhart Wolff | f906d45d48 | |
Burkhart Wolff | 761a336a7a | |
Nicolas Méric | b3f396fb08 | |
Burkhart Wolff | 77aeb3b7ca | |
Burkhart Wolff | 81208f73a8 | |
Burkhart Wolff | f093bfc961 | |
Burkhart Wolff | 2c7df482e8 | |
Burkhart Wolff | c9de5f2293 | |
Nicolas Méric | c6dc848438 | |
Burkhart Wolff | 1acf863845 | |
Burkhart Wolff | a6aca1407e | |
Burkhart Wolff | 4c953fb954 | |
Nicolas Méric | 77e8844687 | |
Nicolas Méric | 939715aba9 | |
Burkhart Wolff | d809211481 | |
Achim D. Brucker | 480272ad86 | |
Achim D. Brucker | d277fa2aed | |
Achim D. Brucker | 9318ea55a0 | |
Achim D. Brucker | 3408b90f89 | |
Burkhart Wolff | dd0a9981a3 | |
Achim D. Brucker | e549bcb23c | |
Achim D. Brucker | 04c8c8d150 | |
Achim D. Brucker | a5885b3eb5 | |
Achim D. Brucker | 4cdb6d725b | |
Achim D. Brucker | 486ae2db97 | |
Burkhart Wolff | fb8da62182 | |
Burkhart Wolff | 6c588c3fe4 | |
Burkhart Wolff | 3ab6f665eb | |
Burkhart Wolff | 0c8bc2cab3 | |
Burkhart Wolff | 20ac16196a | |
Burkhart Wolff | d62cd04e26 | |
Burkhart Wolff | 96d20c127f | |
Burkhart Wolff | 394189e9e0 | |
Burkhart Wolff | 1f79e37d9b | |
Burkhart Wolff | b43de570a4 | |
Burkhart Wolff | debddc45d2 | |
Burkhart Wolff | 3de5548642 | |
Burkhart Wolff | 4157954506 | |
Burkhart Wolff | 25473b177b | |
Nicolas Méric | 36cd3817cf | |
Burkhart Wolff | cb2b0dc230 | |
Burkhart Wolff | c82a3a7e70 | |
Burkhart Wolff | 8c6abf2613 | |
Achim D. Brucker | 07444efd21 | |
Achim D. Brucker | c203327191 | |
Nicolas Méric | a90202953b | |
Achim D. Brucker | 698e6ab169 | |
Achim D. Brucker | 320614004e | |
Burkhart Wolff | 91ff9c67af | |
Burkhart Wolff | 1838baecb9 | |
Nicolas Méric | ef29a9759f | |
Nicolas Méric | 5336e0518f | |
Burkhart Wolff | accc4f40b4 | |
Burkhart Wolff | bbb4b1749c | |
Burkhart Wolff | 4ba0c705b4 | |
Burkhart Wolff | 5d89bcc86a | |
Burkhart Wolff | 07527dbe11 | |
Burkhart Wolff | c0dc60d49e | |
Burkhart Wolff | 81a50c6a9e | |
Burkhart Wolff | 5628eaa2dc | |
Nicolas Méric | 230247de1a | |
Burkhart Wolff | 0834f938a9 | |
Burkhart Wolff | 63c2acfece | |
Burkhart Wolff | 3a4db69184 | |
Burkhart Wolff | 3fc4688f69 | |
Burkhart Wolff | 7dbd016b5d | |
Burkhart Wolff | 3b446c874d | |
Burkhart Wolff | 4de23de5ee | |
Nicolas Méric | 4bd31be71d | |
Nicolas Méric | 826fc489b7 | |
Nicolas Méric | ddcbf76353 | |
Nicolas Méric | 5ad6c0d328 | |
Nicolas Méric | 34d5a194ee | |
Nicolas Méric | 8b09b0c135 | |
Achim D. Brucker | 5292154687 | |
Achim D. Brucker | caf966e3df | |
Achim D. Brucker | 6a1343fd06 | |
Achim D. Brucker | a7db5cc344 | |
Nicolas Méric | de94ef196f | |
Nicolas Méric | c791be2912 | |
Achim D. Brucker | 44528e887d | |
Achim D. Brucker | b3097eaa79 | |
Achim D. Brucker | ecb1e88b78 | |
Achim D. Brucker | 75b39bc168 | |
Nicolas Méric | dde865520a | |
Nicolas Méric | 37afd975b3 | |
Burkhart Wolff | d2a1808fa8 | |
Burkhart Wolff | 94543a86e4 | |
Burkhart Wolff | af096e56fc | |
Burkhart Wolff | 68c1046918 | |
Achim D. Brucker | 1229db1432 | |
Nicolas Méric | 3670d30ddf | |
Burkhart Wolff | 542c38a89c | |
Nicolas Méric | b96302f676 | |
Burkhart Wolff | f60aebccb3 | |
Burkhart Wolff | 224a320165 | |
Nicolas Méric | 92e7ee017a | |
Burkhart Wolff | 8e4ac3f118 | |
Burkhart Wolff | 9fae991ea0 | |
Burkhart Wolff | 6e5fa2d91b | |
Nicolas Méric | b1a0d5d739 | |
Nicolas Méric | 10b90c823f | |
Nicolas Méric | ef8ffda414 | |
Achim D. Brucker | 69485fd497 | |
Achim D. Brucker | f29d888068 | |
Achim D. Brucker | cc805cadbe | |
Achim D. Brucker | 5bf0b00fbc | |
Achim D. Brucker | cc3e6566ca | |
Achim D. Brucker | c297b5cddd | |
Achim D. Brucker | 47c6ce78be | |
Burkhart Wolff | 48c6457f63 | |
Burkhart Wolff | ef3eee03c9 | |
Burkhart Wolff | 853158c916 | |
Burkhart Wolff | 280feb8653 | |
Nicolas Méric | 709187d415 | |
Nicolas Méric | 289d47ee56 | |
Achim D. Brucker | 9c324fde70 | |
Achim D. Brucker | 22abad9026 | |
Nicolas Méric | 40e7285f0a | |
Achim D. Brucker | 3b33166f55 | |
Burkhart Wolff | 0f3beb846e | |
Nicolas Méric | 8e6cb3b991 | |
Achim D. Brucker | baf1d1b629 | |
Achim D. Brucker | de4c7a5168 | |
Achim D. Brucker | 6fe23c16be | |
Achim D. Brucker | 113b3e79bf | |
Achim D. Brucker | daea6333f1 | |
Achim D. Brucker | 53867fb24f | |
Burkhart Wolff | 0f5e7f582b | |
Burkhart Wolff | 0b256adee9 | |
Burkhart Wolff | cbd197e4d8 | |
Burkhart Wolff | 5411aa4d6b | |
Burkhart Wolff | 1895d3b52c | |
Burkhart Wolff | 5bee1fee8f | |
Burkhart Wolff | a64fca4774 | |
Burkhart Wolff | bf4c3d618e | |
Achim D. Brucker | 684a775b07 | |
Achim D. Brucker | 9fe7b26a35 | |
Nicolas Méric | 511c6369dd | |
Achim D. Brucker | 2cb9156488 | |
Achim D. Brucker | ef87b1d81c | |
Nicolas Méric | 5b7a50ba5c | |
Achim D. Brucker | 69808755da | |
Achim D. Brucker | da6bc4277d | |
Achim D. Brucker | 229f7c49de | |
Achim D. Brucker | 3aa1b45837 | |
Achim D. Brucker | 990c6f7708 | |
Achim D. Brucker | 14dd368cd0 | |
Achim D. Brucker | 684e1144bd | |
Achim D. Brucker | 3a39028f1c | |
Achim D. Brucker | ae514aea18 | |
Achim D. Brucker | 9f5473505e | |
Achim D. Brucker | 0c732ec59f | |
Achim D. Brucker | f27150eb88 | |
Achim D. Brucker | bde86a1118 | |
Achim D. Brucker | be2eaab09b | |
Achim D. Brucker | 058324ab5d | |
Achim D. Brucker | 10b4eaf660 | |
Achim D. Brucker | c59858930d | |
Achim D. Brucker | 7ad7c664a3 | |
Achim D. Brucker | dd963a7e09 | |
Achim D. Brucker | 5f88def3be | |
Achim D. Brucker | dfcd00ca73 | |
Achim D. Brucker | e26b4e662e | |
Achim D. Brucker | 02332e8608 | |
Achim D. Brucker | 86152c374b | |
Achim D. Brucker | 233079ef5f | |
Achim D. Brucker | 8389d9ddbe | |
Achim D. Brucker | 85e6cd0372 | |
Achim D. Brucker | 9090772a8a | |
Achim D. Brucker | 070bd363ca | |
Achim D. Brucker | 8e65263093 | |
Achim D. Brucker | acb82477b5 | |
Achim D. Brucker | b90992121e | |
Nicolas Méric | 6a6259bf29 | |
Achim D. Brucker | fb049946c5 | |
Achim D. Brucker | 829915ae2c | |
Achim D. Brucker | 85f115196b | |
Achim D. Brucker | 873f5c79ab | |
Achim D. Brucker | 55f377da39 | |
Achim D. Brucker | 501ea118c2 | |
Achim D. Brucker | a055180b72 | |
Achim D. Brucker | d1c195db26 | |
Achim D. Brucker | 2481603ce1 | |
Achim D. Brucker | b9eeb9e9b8 | |
Achim D. Brucker | fa27d2425e | |
Achim D. Brucker | 8b9c65f6ef | |
Achim D. Brucker | f66b6187f8 | |
Achim D. Brucker | cf386892fc | |
Achim D. Brucker | b0879e98fd | |
Achim D. Brucker | f8399e0fb2 | |
Achim D. Brucker | 0c064b1c8a | |
Achim D. Brucker | 1e0eeea6f9 | |
Achim D. Brucker | 080d867587 | |
Achim D. Brucker | 3e41871b17 | |
Achim D. Brucker | be9ef5a122 | |
Achim D. Brucker | f0fac41148 | |
Achim D. Brucker | 47fa3590aa | |
Achim D. Brucker | fba9ca78e9 | |
Achim D. Brucker | 9287891483 | |
Achim D. Brucker | 30eb47d80c | |
Achim D. Brucker | 00eff9f819 | |
Achim D. Brucker | 73e3cb1098 | |
Achim D. Brucker | 64f4957679 | |
Achim D. Brucker | e4a8ad4227 | |
Achim D. Brucker | 60b1c4f4d4 | |
Achim D. Brucker | de1870fbee | |
Achim D. Brucker | f7b4cf67f7 | |
Achim D. Brucker | 97bf5aa1e3 | |
Achim D. Brucker | d766ac22df | |
Achim D. Brucker | ba90433700 | |
Achim D. Brucker | 762225d20d | |
Achim D. Brucker | aaeb793a51 | |
Achim D. Brucker | 38628c37dc | |
Achim D. Brucker | 43ccaf43f7 | |
Nicolas Méric | 848ce311e2 | |
Nicolas Méric | 6115f0de4a | |
Nicolas Méric | bdfea3ddb1 | |
Nicolas Méric | 9de18b148a | |
Nicolas Méric | 1459b8cfc3 | |
Nicolas Méric | 234ff18ec0 | |
Nicolas Méric | 55690bba33 | |
Nicolas Méric | 93509ab17d | |
Nicolas Méric | 1e09598d81 | |
Nicolas Méric | e01ec9fc21 | |
Nicolas Méric | 7c16d02979 | |
Nicolas Méric | 4a77347e40 | |
Nicolas Méric | 2398fc579a | |
Nicolas Méric | 821eefb230 | |
Nicolas Méric | 9b51844fad | |
Nicolas Méric | c440f9628f | |
Nicolas Méric | 5b3086bbe5 | |
Nicolas Méric | 7c0d2cee55 | |
Nicolas Méric | 7c6150affa | |
Nicolas Méric | ad4ad52b4e | |
Nicolas Méric | ba8227e6ab | |
Nicolas Méric | 20b0af740d | |
Nicolas Méric | 1379f8a671 | |
Achim D. Brucker | 8fdaafa295 | |
Nicolas Méric | 8513f7d267 | |
Nicolas Méric | 2b1a9d009e | |
Nicolas Méric | cd758d2c44 | |
Nicolas Méric | 8496963fec | |
Nicolas Méric | 72d8000f7b | |
Nicolas Méric | 17ec11b297 | |
Nicolas Méric | a96e17abf3 | |
Nicolas Méric | 74b60e47d5 | |
Nicolas Méric | a42dd4ea6c | |
Nicolas Méric | b162a24749 | |
Nicolas Méric | a9432c7b52 | |
Nicolas Méric | 9f28d4949e | |
Nicolas Méric | 885c23a138 | |
Nicolas Méric | a589d4cd47 | |
Burkhart Wolff | e1f143d151 | |
Burkhart Wolff | fd60cf2312 | |
Nicolas Méric | 73dfcd6c1e | |
Nicolas Méric | c0afe1105e | |
Burkhart Wolff | e414b97afb | |
Nicolas Méric | 0b2d28b547 | |
Nicolas Méric | 37d7ed7d17 | |
Nicolas Méric | 312734afbd | |
Burkhart Wolff | 8cee80d78e | |
Makarius Wenzel | ec0d525426 | |
Makarius Wenzel | 791990039b | |
Makarius Wenzel | 78d61390fe | |
Makarius Wenzel | ffcf1f3240 | |
Makarius Wenzel | 5471d873a9 | |
Makarius Wenzel | df37250a00 | |
Makarius Wenzel | 185daeb577 | |
Makarius Wenzel | 8037fd15f2 | |
Makarius Wenzel | afcd78610b | |
Makarius Wenzel | b8a9ef5118 | |
Makarius Wenzel | a4e75c8b12 | |
Makarius Wenzel | d20e9ccd22 | |
Makarius Wenzel | f2ee5d3780 | |
Makarius Wenzel | 44cae2e631 | |
Makarius Wenzel | 7b2bf35353 | |
Makarius Wenzel | e8c7fa6018 | |
Makarius Wenzel | b12e61511d | |
Makarius Wenzel | 3cac42e6cb | |
Makarius Wenzel | aee8ba1df1 | |
Makarius Wenzel | d93e1383d4 | |
Makarius Wenzel | 3d5d1e7476 | |
Makarius Wenzel | 4264e7cd15 | |
Makarius Wenzel | 96f4077c53 | |
Makarius Wenzel | d7fb39d7eb | |
Makarius Wenzel | b95826962f | |
Makarius Wenzel | 912d4bb49e | |
Makarius Wenzel | a6c1a2baa4 | |
Makarius Wenzel | bb5963c6e2 | |
Makarius Wenzel | cc3e2a51a4 | |
Makarius Wenzel | 9e4e5b49eb | |
Makarius Wenzel | b65ecbdbef | |
Makarius Wenzel | 3be2225dcf | |
Makarius Wenzel | f44f0af01c | |
Makarius Wenzel | 9a11baf840 | |
Makarius Wenzel | 48c167aa23 | |
Makarius Wenzel | 700a9bbfee | |
Makarius Wenzel | 73299941ad | |
Makarius Wenzel | 5a8c438c41 | |
Makarius Wenzel | 7772c73aaa | |
Makarius Wenzel | ca18453043 | |
Makarius Wenzel | 1a122b1a87 | |
Makarius Wenzel | 47d95c467e | |
Makarius Wenzel | bf3085d4c0 | |
Makarius Wenzel | 068e6e0411 | |
Makarius Wenzel | 09e9980691 | |
Makarius Wenzel | 94ce3fdec2 | |
Makarius Wenzel | 44819bff02 | |
Makarius Wenzel | a6ab1e101e | |
Makarius Wenzel | c29ec9641a | |
Nicolas Méric | 06833aa190 | |
Nicolas Méric | 4f0c7e1e95 | |
Nicolas Méric | 0040949cf8 | |
Nicolas Méric | e68c332912 | |
Burkhart Wolff | b2c4f40161 | |
Burkhart Wolff | 309952e0ce | |
Burkhart Wolff | 830e1b440a | |
Burkhart Wolff | 2149db9efc | |
Burkhart Wolff | 1547ace64b | |
Burkhart Wolff | 39acd61dfd | |
Burkhart Wolff | 29770b17ee | |
Achim D. Brucker | b4f4048cff | |

@@ -2,3 +2,4 @@ output
 .afp
 *~
 *#
+Isabelle_DOF-Unit-Tests/latex_test/
@@ -1,22 +1,29 @@
 pipeline:
   build:
-    image: docker.io/logicalhacking/isabelle2022
+    image: git.logicalhacking.com/lh-docker/lh-docker-isabelle/isabelle2023:latest
     pull: true
     commands:
       - hg log --limit 2 /root/isabelle
+      - ./.woodpecker/check_dangling_theories
+      - ./.woodpecker/check_external_file_refs
+      - ./.woodpecker/check_quick_and_dirty
       - export ARTIFACT_DIR=$CI_WORKSPACE/.artifacts/$CI_REPO/$CI_BRANCH/$CI_BUILD_NUMBER/$LATEX
       - mkdir -p $ARTIFACT_DIR
      - export `isabelle getenv ISABELLE_HOME_USER`
       - mkdir -p $ISABELLE_HOME_USER/etc
       - echo "ISABELLE_PDFLATEX=\"$LATEX --file-line-error\"" >> $ISABELLE_HOME_USER/etc/settings
-      - isabelle components -u `pwd`
-      - isabelle build -D . -o browser_info
-      - isabelle dof_mkroot DOF_test
+      - isabelle build -x HOL-Proofs -x Isabelle_DOF-Proofs -D . -o browser_info
+      - if [ "$LATEX" = "lualatex" ]; then isabelle build -o 'timeout_scale=2' -D . -o browser_info; else echo "Skipping Isabelle_DOF-Proofs for pdflatex build."; fi
+      - find . -name 'root.tex' -prune -o -name 'output' -type f | xargs latexmk -$LATEX -cd -quiet -Werror
+      - isabelle components -u .
+      - isabelle dof_mkroot -q DOF_test
+      - isabelle build -D DOF_test
       - cp -r $ISABELLE_HOME_USER/browser_info $ARTIFACT_DIR
       - cd $ARTIFACT_DIR
       - cd ../..
       - ln -s * latest
   archive:
-    image: docker.io/logicalhacking/isabelle2022
+    image: git.logicalhacking.com/lh-docker/lh-docker-isabelle/isabelle2023:latest
     commands:
       - export ARTIFACT_DIR=$CI_WORKSPACE/.artifacts/$CI_REPO/$CI_BRANCH/$CI_BUILD_NUMBER/$LATEX
       - mkdir -p $ARTIFACT_DIR
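The line ``export `isabelle getenv ISABELLE_HOME_USER` `` in the pipeline above relies on a small shell idiom: `isabelle getenv` prints a `NAME=value` pair, and handing that unquoted command substitution to `export` defines and exports the variable in one step. A minimal sketch of the idiom, using a hypothetical `print_setting` stand-in (the function and the path are illustrative, not part of Isabelle):

```shell
#!/bin/sh
# Stand-in for `isabelle getenv NAME`, which prints NAME=value on stdout.
# (print_setting and the path below are illustrative only.)
print_setting() {
  echo "ISABELLE_HOME_USER=/tmp/demo-isabelle-home"
}

# The unquoted substitution expands to the single NAME=value word,
# so `export` both assigns and exports the variable in one step.
export `print_setting`

echo "$ISABELLE_HOME_USER"
mkdir -p "$ISABELLE_HOME_USER/etc"
```

The pipeline uses exactly this to locate `$ISABELLE_HOME_USER/etc/settings` before appending the `ISABELLE_PDFLATEX` line. Note the idiom only works while the printed value contains no whitespace.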

@@ -38,7 +45,7 @@ pipeline:
         from_secret: artifacts_ssh
       user: artifacts
   notify:
-    image: drillster/drone-email
+    image: docker.io/drillster/drone-email
     settings:
       host: smtp.0x5f.org
       username: woodpecker
@@ -0,0 +1,33 @@
+#!/bin/bash
+
+set -e
+
+failuremsg="Error"
+failurecode=1
+
+while [ $# -gt 0 ]
+do
+  case "$1" in
+    --warning|-w)
+      failuremsg="Warning"
+      failurecode=0;;
+  esac
+  shift
+done
+
+echo "Checking for theories that are not part of an Isabelle session:"
+echo "==============================================================="
+
+PWD=`pwd`
+TMPDIR=`mktemp -d`
+isabelle build -D . -l -n | grep $PWD | sed -e "s| *${PWD}/||" | sort -u | grep thy$ > ${TMPDIR}/sessions-thy-files.txt
+find * -type f | sort -u | grep thy$ > ${TMPDIR}/actual-thy-files.txt
+thylist=`comm -13 ${TMPDIR}/sessions-thy-files.txt ${TMPDIR}/actual-thy-files.txt`
+if [ -z "$thylist" ] ; then
+  echo " * Success: No dangling theories found."
+  exit 0
+else
+  echo -e "$thylist"
+  echo "$failuremsg: Dangling theories found (see list above)!"
+  exit $failurecode
+fi
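The core of `check_dangling_theories` is `comm -13`: given two sorted files, it prints only the lines unique to the second one, here the theory files that exist on disk but are claimed by no session. A self-contained sketch with made-up file lists (the theory names are illustrative):

```shell
#!/bin/bash
set -e
TMPDIR=$(mktemp -d)

# Theories claimed by sessions vs. theory files actually on disk.
printf 'A.thy\nB.thy\n'        | sort -u > "$TMPDIR/sessions-thy-files.txt"
printf 'A.thy\nB.thy\nC.thy\n' | sort -u > "$TMPDIR/actual-thy-files.txt"

# comm -13 suppresses column 1 (only in first file) and column 3 (common),
# leaving files on disk that no session lists: here, C.thy.
thylist=$(comm -13 "$TMPDIR/sessions-thy-files.txt" "$TMPDIR/actual-thy-files.txt")
echo "$thylist"   # prints: C.thy

rm -rf "$TMPDIR"
```

Both inputs must be sorted with the same collation, which is why the script pipes each list through `sort -u` first.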

@@ -0,0 +1,45 @@
+#!/bin/sh
+
+
+
+failuremsg="Error"
+failurecode=1
+
+while [ $# -gt 0 ]
+do
+  case "$1" in
+    --warning|-w)
+      failuremsg="Warning"
+      failurecode=0;;
+  esac
+  shift
+done
+
+DIRREGEXP="\\.\\./"
+
+echo "Checking for references pointing outside of session directory:"
+echo "=============================================================="
+
+REGEXP=$DIRREGEXP
+DIR=$DIRMATCH
+failed=0
+for i in $(seq 1 10); do
+  FILES=`find * -mindepth $((i-1)) -maxdepth $i -type f | xargs`
+  if [ -n "$FILES" ]; then
+    grep -s ${REGEXP} ${FILES}
+    exit=$?
+    if [ "$exit" -eq 0 ] ; then
+      failed=1
+    fi
+  fi
+  REGEXP="${DIRREGEXP}${REGEXP}"
+done
+
+
+if [ "$failed" -ne 0 ] ; then
+  echo "$failuremsg: Forbidden reference to files outside of their session directory!"
+  exit $failurecode
+fi
+
+echo " * Success: No relative references to files outside of their session directory found."
+exit 0
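`check_external_file_refs` grows its pattern by one escaped `../` per directory level: a file nested *i* levels deep needs at least *i* `../` components to escape the session tree, so the regexp gains one component per iteration. A reduced sketch of that growth and the resulting match behaviour (the sample reference string is illustrative):

```shell
#!/bin/sh
DIRREGEXP="\\.\\./"     # one escaped "../" path component
REGEXP=$DIRREGEXP

# The sample reference climbs exactly two levels, so it "escapes" for
# files at depth 1 and 2 but is harmless for files at depth 3.
for i in 1 2 3; do
  if echo 'input ../../shared/macros.tex' | grep -q "${REGEXP}"; then
    echo "depth $i: would escape"
  else
    echo "depth $i: safe"
  fi
  REGEXP="${DIRREGEXP}${REGEXP}"
done
```

Running the loop prints `would escape` for depths 1 and 2 and `safe` for depth 3, mirroring how the real script only flags references that climb past the session root.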

@@ -0,0 +1,30 @@
+#!/bin/bash
+
+set -e
+
+failuremsg="Error"
+failurecode=1
+
+while [ $# -gt 0 ]
+do
+  case "$1" in
+    --warning|-w)
+      failuremsg="Warning"
+      failurecode=0;;
+  esac
+  shift
+done
+
+echo "Checking for sessions with quick_and_dirty mode enabled:"
+echo "========================================================"
+
+rootlist=`find -name 'ROOT' -exec grep -l 'quick_and_dirty *= *true' {} \;`
+
+if [ -z "$rootlist" ] ; then
+  echo " * Success: No sessions with quick_and_dirty mode enabled found."
+  exit 0
+else
+  echo -e "$rootlist"
+  echo "$failuremsg: Sessions with quick_and_dirty mode enabled found (see list above)!"
+  exit $failurecode
+fi
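All three check scripts open with the same option loop: `while`/`case`/`shift` walks the argument list, and `--warning` (or `-w`) downgrades the check from a hard failure (`exit 1`) to a warning (`exit 0`). The loop in isolation, wrapped in a function so it can be exercised directly (the wrapper function is an addition for the sketch):

```shell
#!/bin/sh
failuremsg="Error"
failurecode=1

# Same loop as in the CI checks: consume arguments one by one;
# --warning/-w softens the outcome, anything else is ignored.
parse_args() {
  while [ $# -gt 0 ]
  do
    case "$1" in
      --warning|-w)
        failuremsg="Warning"
        failurecode=0;;
    esac
    shift
  done
}

parse_args -w extra-arg
echo "$failuremsg (exit $failurecode)"   # prints: Warning (exit 0)
```

Because POSIX shell functions share the caller's variables, the assignments inside the loop are visible after the call, which is exactly how the scripts use them at their exit sites.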

@@ -83,22 +83,22 @@ build_and_install_manuals()
   if [ "$DIRTY" = "true" ]; then
     if [ -z ${ARTIFACT_DIR+x} ]; then
       echo " * Quick and Dirty Mode (local build)"
-      $ISABELLE build -d . Isabelle_DOF-Manual 2018-cicm-isabelle_dof-applications
-      mkdir -p $ISADOF_WORK_DIR/examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/
-      cp examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/document.pdf \
-         $ISADOF_WORK_DIR/examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/
-      mkdir -p $ISADOF_WORK_DIR/examples/technical_report/Isabelle_DOF-Manual/output/
-      cp examples/technical_report/Isabelle_DOF-Manual/output/document.pdf \
-         $ISADOF_WORK_DIR/examples/technical_report/Isabelle_DOF-Manual/output/;
+      $ISABELLE build -d . Isabelle_DOF Isabelle_DOF-Example-I
+      mkdir -p $ISADOF_WORK_DIR/Isabelle_DOF-Example-I/output/
+      cp Isabelle_DOF-Example-I/output/document.pdf \
+         $ISADOF_WORK_DIR/Isabelle_DOF-Example-I/output/
+      mkdir -p $ISADOF_WORK_DIR/Isabelle_DOF/output/
+      cp Isabelle_DOF/output/document.pdf \
+         $ISADOF_WORK_DIR/Isabelle_DOF/output/;
     else
       echo " * Quick and Dirty Mode (running on CI)"
-      mkdir -p $ISADOF_WORK_DIR/examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/
-      cp $ARTIFACT_DIR/browser_info/Unsorted/2018-cicm-isabelle_dof-applications/document.pdf \
-         $ISADOF_WORK_DIR/examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/
+      mkdir -p $ISADOF_WORK_DIR/Isabelle_DOF-Example-I/output/
+      cp $ARTIFACT_DIR/browser_info/AFP/Isabelle_DOF-Example-I/document.pdf \
+         $ISADOF_WORK_DIR/Isabelle_DOF-Example-I/output/

-      mkdir -p $ISADOF_WORK_DIR/examples/technical_report/Isabelle_DOF-Manual/output/
-      cp $ARTIFACT_DIR/browser_info/Unsorted/Isabelle_DOF-Manual/document.pdf \
-         $ISADOF_WORK_DIR/examples/technical_report/Isabelle_DOF-Manual/output/;
+      mkdir -p $ISADOF_WORK_DIR/Isabelle_DOF/output/
+      cp $ARTIFACT_DIR/browser_info/AFP/Isabelle_DOF/document.pdf \
+         $ISADOF_WORK_DIR/Isabelle_DOF/output/;
     fi
   else
     (cd $ISADOF_WORK_DIR && $ISABELLE env ./install-afp)
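The branch condition `if [ -z ${ARTIFACT_DIR+x} ]` in the installer distinguishes an *unset* variable from an empty one: `${VAR+x}` expands to `x` whenever `VAR` is set, even to the empty string, and to nothing when it is unset. A small sketch of the idiom (the `mode` wrapper is an addition for the sketch):

```shell
#!/bin/sh
mode() {
  # ${ARTIFACT_DIR+x} is "x" iff the variable is set at all, so -z on
  # it detects "completely unset" - the local-build case in the installer.
  if [ -z "${ARTIFACT_DIR+x}" ]; then
    echo "local build"
  else
    echo "running on CI"
  fi
}

unset ARTIFACT_DIR
mode              # prints: local build
ARTIFACT_DIR=""
mode              # prints: running on CI (set, although empty)
```

This matters in CI, where `ARTIFACT_DIR` may legitimately be exported empty; a plain `-z "$ARTIFACT_DIR"` would misclassify that as a local build.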

@@ -107,13 +107,13 @@ build_and_install_manuals()
   mkdir -p $ISADOF_WORK_DIR/doc
   echo "Isabelle/DOF Manuals!" > $ISADOF_WORK_DIR/doc/Contents

-  cp $ISADOF_WORK_DIR/examples/technical_report/Isabelle_DOF-Manual/output/document.pdf \
+  cp $ISADOF_WORK_DIR/Isabelle_DOF/output/document.pdf \
     $ISADOF_WORK_DIR/doc/Isabelle_DOF-Manual.pdf
   echo " Isabelle_DOF-Manual User and Implementation Manual for Isabelle/DOF" >> $ISADOF_WORK_DIR/doc/Contents

-  cp $ISADOF_WORK_DIR/examples/scholarly_paper/2018-cicm-isabelle_dof-applications/output/document.pdf \
-    $ISADOF_WORK_DIR/doc/2018-cicm-isabelle_dof-applications.pdf
-  echo " 2018-cicm-isabelle_dof-applications Example academic paper" >> $ISADOF_WORK_DIR/doc/Contents
+  cp $ISADOF_WORK_DIR/Isabelle_DOF-Example-I/output/document.pdf \
+    $ISADOF_WORK_DIR/doc/Isabelle_DOF-Example-I.pdf
+  echo " Isabelle_DOF-Example-I Example academic paper" >> $ISADOF_WORK_DIR/doc/Contents

   find $ISADOF_WORK_DIR -type d -name "output" -exec rm -rf {} \; &> /dev/null || true
   rm -rf $ISADOF_WORK_DIR/.git* $ISADOF_WORK_DIR/.woodpecker $ISADOF_WORK_DIR/.afp

@@ -143,7 +143,6 @@ publish_archive()
   ssh 0x5f.org chmod go+u-w -R www/$DOF_ARTIFACT_HOST/htdocs/$DOF_ARTIFACT_DIR
 }

-
 ISABELLE=`which isabelle`
 USE_TAG="false"
 SIGN="false"

@@ -194,8 +193,8 @@ for i in $VARS; do
   export "$i"
 done

-ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL options -g dof_isabelle)"
-DOF_VERSION="$($ISABELLE_TOOL options -g dof_version)"
+ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL dof_param -b isabelle_version)"
+DOF_VERSION="$($ISABELLE_TOOL dof_param -b dof_version)"

 ISABELLE_SHORT_VERSION=`echo $ISABELLE_VERSION | sed -e 's/:.*$//'`
 ISADOF_TAR="Isabelle_DOF-"$DOF_VERSION"_"$ISABELLE_SHORT_VERSION
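The `ISABELLE_SHORT_VERSION` line above strips everything from the first colon onward with `sed 's/:.*$//'`, turning a long release identifier into a short name for the tarball. A sketch with assumed input values (the real strings come from the `dof_param` queries):

```shell
#!/bin/sh
# Example inputs only - the release script obtains these via
# `isabelle dof_param`; the values below are assumptions for the sketch.
ISABELLE_VERSION="Isabelle2023: September 2023"
DOF_VERSION="1.4.0"

# Drop the colon and everything after it, keeping the release name.
ISABELLE_SHORT_VERSION=`echo $ISABELLE_VERSION | sed -e 's/:.*$//'`

ISADOF_TAR="Isabelle_DOF-"$DOF_VERSION"_"$ISABELLE_SHORT_VERSION
echo "$ISADOF_TAR"   # prints: Isabelle_DOF-1.4.0_Isabelle2023
```

The mixed-quoting concatenation on the `ISADOF_TAR` line is equivalent to one double-quoted string; the script keeps the literal parts quoted and lets the variables expand in between.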

@@ -221,4 +220,3 @@ fi

 rm -rf $BUILD_DIR
-
 exit 0

@@ -11,7 +11,7 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.

 ### Changed

-- Updated Isabelle version to Isabelle 2022
+- Updated Isabelle version to Isabelle 2023

 ## [1.3.0] - 2022-07-08

@@ -16,6 +16,9 @@ theory IsaDofApplications
   imports "Isabelle_DOF.scholarly_paper"
 begin

+use_template "lncs"
+use_ontology "Isabelle_DOF.scholarly_paper"
+
 open_monitor*[this::article]
 declare[[strict_monitor_checking=false]]
@ -27,6 +30,61 @@ define_shortcut* isadof \<rightleftharpoons> \<open>\isadof\<close>
|
|||
|
||||
(* slanted text in contrast to italics *)
|
||||
define_macro* slanted_text \<rightleftharpoons> \<open>\textsl{\<close> _ \<open>}\<close>
|
||||
define_macro* unchecked_label \<rightleftharpoons> \<open>\autoref{\<close> _ \<open>}\<close>
|
||||
|
||||
ML\<open>
|
||||
|
||||
fun boxed_text_antiquotation name (* redefined in these more abstract terms *) =
|
||||
DOF_lib.gen_text_antiquotation name DOF_lib.report_text
|
||||
(fn ctxt => DOF_lib.string_2_text_antiquotation ctxt
|
||||
#> DOF_lib.enclose_env false ctxt "isarbox")
|
||||
|
||||
val neant = K(Latex.text("",\<^here>))
|
||||
|
||||
fun boxed_theory_text_antiquotation name (* redefined in these more abstract terms *) =
|
||||
DOF_lib.gen_text_antiquotation name DOF_lib.report_theory_text
|
||||
(fn ctxt => DOF_lib.string_2_theory_text_antiquotation ctxt
|
||||
#> DOF_lib.enclose_env false ctxt "isarbox"
|
||||
(* #> neant *)) (*debugging *)
|
||||
|
||||
fun boxed_sml_text_antiquotation name =
|
||||
DOF_lib.gen_text_antiquotation name (K(K()))
|
||||
(fn ctxt => Input.source_content
|
||||
#> Latex.text
|
||||
#> DOF_lib.enclose_env true ctxt "sml")
|
||||
(* the simplest conversion possible *)
|
||||
|
||||
fun boxed_pdf_antiquotation name =
|
||||
DOF_lib.gen_text_antiquotation name (K(K()))
|
||||
(fn ctxt => Input.source_content
|
||||
#> Latex.text
|
||||
#> DOF_lib.enclose_env true ctxt "out")
|
||||
(* the simplest conversion possible *)
|
||||
|
||||
fun boxed_latex_antiquotation name =
|
||||
DOF_lib.gen_text_antiquotation name (K(K()))
|
||||
(fn ctxt => Input.source_content
|
||||
#> Latex.text
|
||||
#> DOF_lib.enclose_env true ctxt "ltx")
|
||||
(* the simplest conversion possible *)
|
||||
|
||||
fun boxed_bash_antiquotation name =
|
||||
DOF_lib.gen_text_antiquotation name (K(K()))
|
||||
(fn ctxt => Input.source_content
|
||||
#> Latex.text
|
||||
#> DOF_lib.enclose_env true ctxt "bash")
|
||||
(* the simplest conversion possible *)
|
||||
\<close>
|
||||
|
||||
setup\<open>boxed_text_antiquotation \<^binding>\<open>boxed_text\<close> #>
|
||||
boxed_text_antiquotation \<^binding>\<open>boxed_cartouche\<close> #>
|
||||
boxed_theory_text_antiquotation \<^binding>\<open>boxed_theory_text\<close> #>
|
||||
|
||||
boxed_sml_text_antiquotation \<^binding>\<open>boxed_sml\<close> #>
|
||||
boxed_pdf_antiquotation \<^binding>\<open>boxed_pdf\<close> #>
|
||||
boxed_latex_antiquotation \<^binding>\<open>boxed_latex\<close>#>
|
||||
boxed_bash_antiquotation \<^binding>\<open>boxed_bash\<close>
|
||||
\<close>
|
||||
|
||||
(*>*)
|
||||
|
||||
|
@@ -71,7 +129,7 @@ abstract*[abs::abstract, keywordlist="[''Ontology'',''Ontological Modeling'',''I
 \<close>

 section*[intro::introduction]\<open> Introduction \<close>
-text*[introtext::introduction]\<open>
+text*[introtext::introduction, level = "Some 1"]\<open>
 The linking of the \<^emph>\<open>formal\<close> to the \<^emph>\<open>informal\<close> is perhaps the
 most pervasive challenge in the digitization of knowledge and its
 propagation. This challenge incites numerous research efforts

@@ -99,20 +157,18 @@ document evolution. Based on Isabelle infrastructures, ontologies may refer to
 types, terms, proven theorems, code, or established assertions.
 Based on a novel adaption of the Isabelle IDE, a document is checked to be
 \<^emph>\<open>conform\<close> to a particular ontology---\<^isadof> is designed to give fast user-feedback
-\<^emph>\<open>during the capture of content\<close>. This is particularly valuable in case of document
+\<^emph>\<open>during the capture of content\<close>. This is particularly valuable for document
 changes, where the \<^emph>\<open>coherence\<close> between the formal and the informal parts of the
 content can be mechanically checked.

-To avoid any misunderstanding: \<^isadof> is \<^emph>\<open>not a theory in HOL\<close>
-on ontologies and operations to track and trace links in texts,
-it is an \<^emph>\<open>environment to write structured text\<close> which \<^emph>\<open>may contain\<close>
-\<^isabelle> definitions and proofs like mathematical articles, tech-reports and
-scientific papers---as the present one, which is written in \<^isadof>
-itself. \<^isadof> is a plugin into the Isabelle/Isar
-framework in the style of~@{cite "wenzel.ea:building:2007"}.
+To avoid any misunderstanding: \<^isadof> is \<^emph>\<open>not a theory in HOL\<close> on ontologies and operations
+to track and trace links in texts, it is an \<^emph>\<open>environment to write structured text\<close> which
+\<^emph>\<open>may contain\<close> \<^isabelle> definitions and proofs like mathematical articles, tech-reports and
+scientific papers---as the present one, which is written in \<^isadof> itself. \<^isadof> is a plugin
+into the Isabelle/Isar framework in the style of~@{cite "wenzel.ea:building:2007"}.
 \<close>

-(* declaring the forward references used in the subsequent section *)
+(* declaring the forward references used in the subsequent sections *)
 (*<*)
 declare_reference*[bgrnd::text_section]
 declare_reference*[isadof::text_section]
@ -120,29 +176,25 @@ declare_reference*[ontomod::text_section]
|
|||
declare_reference*[ontopide::text_section]
|
||||
declare_reference*[conclusion::text_section]
|
||||
(*>*)
|
||||
text*[plan::introduction]\<open> The plan of the paper is follows: we start by introducing the underlying
Isabelle system (@{text_section (unchecked) \<open>bgrnd\<close>}) followed by presenting the
essentials of \<^isadof> and its ontology language (@{text_section (unchecked) \<open>isadof\<close>}).
text*[plan::introduction, level="Some 1"]\<open> The plan of the paper is as follows: we start by
introducing the underlying Isabelle system (@{text_section (unchecked) \<open>bgrnd\<close>}) followed by
presenting the essentials of \<^isadof> and its ontology language (@{text_section (unchecked) \<open>isadof\<close>}).
It follows @{text_section (unchecked) \<open>ontomod\<close>}, where we present three application
scenarios from the point of view of the ontology modeling. In @{text_section (unchecked) \<open>ontopide\<close>}
we discuss the user-interaction generated from the ontological definitions. Finally, we draw
conclusions and discuss related work in @{text_section (unchecked) \<open>conclusion\<close>}. \<close>

section*[bgrnd::text_section,main_author="Some(@{docitem ''bu''}::author)"]
section*[bgrnd::text_section,main_author="Some(@{author ''bu''}::author)"]
\<open> Background: The Isabelle System \<close>
text*[background::introduction]\<open>
While Isabelle is widely perceived as an interactive theorem prover
for HOL (Higher-order Logic)~@{cite "nipkow.ea:isabelle:2002"}, we
would like to emphasize the view that Isabelle is far more than that:
it is the \<^emph>\<open>Eclipse of Formal Methods Tools\<close>. This refers to the
``\<^slanted_text>\<open>generic system framework of Isabelle/Isar underlying recent
versions of Isabelle. Among other things, Isar provides an
infrastructure for Isabelle plug-ins, comprising extensible state
components and extensible syntax that can be bound to ML
programs. Thus, the Isabelle/Isar architecture may be understood as
an extension and refinement of the traditional `LCF approach', with
explicit infrastructure for building derivative
\<^emph>\<open>systems\<close>.\<close>''~@{cite "wenzel.ea:building:2007"}
text*[background::introduction, level="Some 1"]\<open>
While Isabelle is widely perceived as an interactive theorem prover for HOL
(Higher-order Logic)~@{cite "nipkow.ea:isabelle:2002"}, we would like to emphasize the view that
Isabelle is far more than that: it is the \<^emph>\<open>Eclipse of Formal Methods Tools\<close>. This refers to the
``\<^slanted_text>\<open>generic system framework of Isabelle/Isar underlying recent versions of Isabelle.
Among other things, Isar provides an infrastructure for Isabelle plug-ins, comprising extensible
state components and extensible syntax that can be bound to ML programs. Thus, the Isabelle/Isar
architecture may be understood as an extension and refinement of the traditional `LCF approach',
with explicit infrastructure for building derivative \<^emph>\<open>systems\<close>.\<close>''~@{cite "wenzel.ea:building:2007"}

The current system framework offers moreover the following features:
@@ -154,12 +206,12 @@ The current system framework offers moreover the following features:
the most prominent and deeply integrated system component.
\<close>

figure*[architecture::figure,relative_width="100",src="''figures/isabelle-architecture''"]\<open>
figure*[architecture::figure,relative_width="100",file_src="''figures/isabelle-architecture.pdf''"]\<open>
The system architecture of Isabelle (left-hand side) and the
asynchronous communication between the Isabelle system and
the IDE (right-hand side). \<close>

text*[blug::introduction]\<open> The Isabelle system architecture shown in @{figure \<open>architecture\<close>}
text*[blug::introduction, level="Some 1"]\<open> The Isabelle system architecture shown in @{figure \<open>architecture\<close>}
comes with many layers, with Standard ML (SML) at the bottom layer as implementation
language. The architecture actually foresees a \<^emph>\<open>Nano-Kernel\<close> (our terminology) which
resides in the SML structure \<^ML_structure>\<open>Context\<close>. This structure provides a kind of container called

@@ -169,41 +221,39 @@ automated proof procedures as well as specific support for higher specification
were built. \<close>

text\<open> We would like to detail the documentation generation of the architecture,
which is based on literate specification commands such as \inlineisar+section+ \<^dots>,
\inlineisar+subsection+ \<^dots>, \inlineisar+text+ \<^dots>, etc.
which is based on literate specification commands such as \<^theory_text>\<open>section\<close> \<^dots>,
\<^theory_text>\<open>subsection\<close> \<^dots>, \<^theory_text>\<open>text\<close> \<^dots>, etc.
Thus, a user can add a simple text:
\begin{isar}
text\<Open>This is a description.\<Close>
\end{isar}
@{boxed_theory_text [display]\<open>
text\<open> This is a description.\<close>\<close>}
These text-commands can be arbitrarily mixed with other commands stating definitions, proofs, code, etc.,
and will result in the corresponding output in generated \<^LaTeX> or HTML documents.
Now, \<^emph>\<open>inside\<close> the textual content, it is possible to embed a \<^emph>\<open>text-antiquotation\<close>:
\begin{isar}
text\<Open>According to the reflexivity axiom \at{thm refl}, we obtain in \<Gamma>
for \at{term "fac 5"} the result \at{value "fac 5"}.\<Close>
\end{isar}
@{boxed_theory_text [display]\<open>
text\<open> According to the \<^emph>\<open>reflexivity\<close> axiom @{thm refl},
we obtain in \<Gamma> for @{term "fac 5"} the result @{value "fac 5"}.\<close>\<close>}

which is represented in the generated output by:
\begin{out}
According to the reflexivity axiom $x = x$, we obtain in $\Gamma$ for $\operatorname{fac} 5$ the result $120$.
\end{out}
where \inlineisar+refl+ is actually the reference to the axiom of reflexivity in HOL.
For the antiquotation \inlineisar+\at{value "fac 5"}+ we assume the usual definition for
\inlineisar+fac+ in HOL.
@{boxed_pdf [display]\<open>According to the reflexivity axiom $x = x$, we obtain in $\Gamma$ for $\operatorname{fac} 5$ the result $120$.\<close>}

where \<^theory_text>\<open>refl\<close> is actually the reference to the axiom of reflexivity in HOL.
For the antiquotation \<^theory_text>\<open>@{value "''fac 5''"}\<close> we assume the usual definition for
\<^theory_text>\<open>fac\<close> in HOL.
\<close>

text*[anti]\<open> Thus, antiquotations can refer to formal content, can be type-checked before being
displayed and can be used for calculations before actually being typeset. When editing,
Isabelle's PIDE offers auto-completion and error-messages while typing the above
\<^emph>\<open>semi-formal\<close> content. \<close>
text*[anti::introduction, level = "Some 1"]\<open> Thus, antiquotations can refer to formal content,
can be type-checked before being displayed and can be used for calculations before actually being
typeset. When editing, Isabelle's PIDE offers auto-completion and error-messages while typing the
above \<^emph>\<open>semi-formal\<close> content.\<close>

section*[isadof::technical,main_author="Some(@{docitem ''adb''}::author)"]\<open> \<^isadof> \<close>
section*[isadof::technical,main_author="Some(@{author ''adb''}::author)"]\<open> \<^isadof> \<close>

text\<open> An \<^isadof> document consists of three components:
\<^item> the \<^emph>\<open>ontology definition\<close> which is an Isabelle theory file with definitions
for document-classes and all auxiliary datatypes.
\<^item> the \<^emph>\<open>core\<close> of the document itself which is an Isabelle theory
importing the ontology definition. \<^isadof> provides an own family of text-element
commands such as \inlineisar+title*+, \inlineisar+section*+, \inlineisar+text*+, etc.,
commands such as \<^theory_text>\<open>title*\<close>, \<^theory_text>\<open>section*\<close>, \<^theory_text>\<open>text*\<close>, etc.,
which can be annotated with meta-information defined in the underlying ontology definition.
\<^item> the \<^emph>\<open>layout definition\<close> for the given ontology exploiting this meta-information.
\<close>

@@ -212,7 +262,7 @@ three parts. Note that the document core \<^emph>\<open>may\<close>, but \<^emph
use Isabelle definitions or proofs for checking the formal content---the
present paper is actually an example of a document not containing any proof.

The document generation process of \<^isadof> is currently restricted to \LaTeX, which means
The document generation process of \<^isadof> is currently restricted to \<^LaTeX>, which means
that the layout is defined by a set of \<^LaTeX> style files. Several layout
definitions for one ontology are possible and pave the way that different \<^emph>\<open>views\<close> for
the same central document were generated, addressing the needs of different purposes `

@@ -226,65 +276,47 @@ style-files (\<^verbatim>\<open>.sty\<close>-files). In the document core author
their source, but this limits the possibility of using different representation technologies,
\<^eg>, HTML, and increases the risk of arcane error-messages in generated \<^LaTeX>.

The \<^isadof> ontology specification language consists basically on a notation for
document classes, where the attributes were typed with HOL-types and can be instantiated
by terms HOL-terms, \<^ie>, the actual parsers and type-checkers of the Isabelle system were reused.
This has the particular advantage that \<^isadof> commands can be arbitrarily mixed with
Isabelle/HOL commands providing the machinery for type declarations and term specifications such
as enumerations. In particular, document class definitions provide:
The \<^isadof> ontology specification language consists basically on a notation for document classes,
where the attributes were typed with HOL-types and can be instantiated by terms HOL-terms, \<^ie>,
the actual parsers and type-checkers of the Isabelle system were reused. This has the particular
advantage that \<^isadof> commands can be arbitrarily mixed with Isabelle/HOL commands providing the
machinery for type declarations and term specifications such as enumerations. In particular,
document class definitions provide:
\<^item> a HOL-type for each document class as well as inheritance,
\<^item> support for attributes with HOL-types and optional default values,
\<^item> support for overriding of attribute defaults but not overloading, and
\<^item> text-elements annotated with document classes; they are mutable
instances of document classes.
\<close>
instances of document classes.\<close>
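The class mechanism described by the four items above can be sketched with a small, self-contained example; the class and attribute names below are invented for illustration and do not occur in the ontologies discussed in this paper:

```isabelle
doc_class note =                        (* introduces a HOL-type "note" *)
  priority :: "int" <= "0"              (* attribute with HOL-type and default value *)

doc_class urgent_note = note +          (* inheritance: urgent_note extends note *)
  deadline :: "string option" <= None   (* additional attribute with default None *)
```

A text-element annotated as \<^theory_text>\<open>text*[n1::urgent_note]\<close> would then create a mutable instance of \<^theory_text>\<open>urgent_note\<close>, inheriting \<^theory_text>\<open>priority\<close> and its default.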

text\<open>
Attributes referring to other ontological concepts are called \<^emph>\<open>links\<close>.
The HOL-types inside the document specification language support built-in types for Isabelle/HOL
\inlineisar+typ+'s, \inlineisar+term+'s, and \inlineisar+thm+'s reflecting internal Isabelle's
internal types for these entities; when denoted in HOL-terms to instantiate an attribute, for
example, there is a specific syntax (called \<^emph>\<open>inner syntax antiquotations\<close>) that is checked by
\<^isadof> for consistency.
Attributes referring to other ontological concepts are called \<^emph>\<open>links\<close>. The HOL-types inside the
document specification language support built-in types for Isabelle/HOL \<^theory_text>\<open>typ\<close>'s, \<^theory_text>\<open>term\<close>'s, and
\<^theory_text>\<open>thm\<close>'s reflecting internal Isabelle's internal types for these entities; when denoted in
HOL-terms to instantiate an attribute, for example, there is a specific syntax
(called \<^emph>\<open>inner syntax antiquotations\<close>) that is checked by \<^isadof> for consistency.
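A sketch of such a checked attribute, with hypothetical class and instance names: an attribute of type \<^theory_text>\<open>thm list\<close> is instantiated via an inner syntax antiquotation that \<^isadof> validates against the current theory context:

```isabelle
doc_class result =
  evidence :: "thm list" <= "[]"

(* the instantiation below is only accepted if ''refl'' names an existing theorem *)
text*[r1::result, evidence = "[@{thm ''refl''}]"]
\<open>The claim follows from reflexivity.\<close>
```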

Document classes can have a \inlineisar+where+ clause containing a regular
expression over class names. Classes with such a \inlineisar+where+ were called \<^emph>\<open>monitor classes\<close>.
While document classes and their inheritance relation structure meta-data of text-elements
in an object-oriented manner, monitor classes enforce structural organization
of documents via the language specified by the regular expression
enforcing a sequence of text-elements that must belong to the corresponding classes.

To start using \<^isadof>, one creates an Isabelle project (with the name
\inlinebash{IsaDofApplications}):
\begin{bash}
isabelle dof_mkroot -o scholarly_paper -t lncs IsaDofApplications
\end{bash}
where the \inlinebash{-o scholarly_paper} specifies the ontology for writing scientific articles and
\inlinebash{-t lncs} specifies the use of Springer's \LaTeX-configuration for the Lecture Notes in
Computer Science series. The project can be formally checked, including the generation of the
article in PDF using the following command:
\begin{bash}
isabelle build -d . IsaDofApplications
\end{bash}
\<close>
Document classes can have a \<^theory_text>\<open>where\<close> clause containing a regular expression over class names.
Classes with such a \<^theory_text>\<open>where\<close> were called \<^emph>\<open>monitor classes\<close>. While document classes and their
inheritance relation structure meta-data of text-elements in an object-oriented manner, monitor
classes enforce structural organization of documents via the language specified by the regular
expression enforcing a sequence of text-elements that belong to the corresponding classes. \<^vs>\<open>-0.4cm\<close>\<close>
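A minimal monitor class in this style might look as follows; the names are hypothetical, and the combinators \<^theory_text>\<open>~~\<close> (sequence) and \<^theory_text>\<open>\<lbrace>_\<rbrace>\<^sup>+\<close> (one or more repetitions) follow the notation used in the monitor examples of this paper:

```isabelle
doc_class report =
  style :: "string option" <= None
  where "title ~~ \<lbrace>section\<rbrace>\<^sup>+"   (* a title followed by one or more sections *)
```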

section*[ontomod::text_section]\<open> Modeling Ontologies in \<^isadof> \<close>
text\<open> In this section, we will use the \<^isadof> document ontology language
for three different application scenarios: for scholarly papers, for mathematical
exam sheets as well as standardization documents where the concepts of the
standard are captured in the ontology. For space reasons, we will concentrate in all three
cases on aspects of the modeling due to space limitations.\<close>
text\<open> In this section, we will use the \<^isadof> document ontology language for three different
application scenarios: for scholarly papers, for mathematical exam sheets as well as standardization
documents where the concepts of the standard are captured in the ontology. For space reasons, we
will concentrate in all three cases on aspects of the modeling due to space limitations.\<close>

subsection*[scholar_onto::example]\<open> The Scholar Paper Scenario: Eating One's Own Dog Food. \<close>
text\<open> The following ontology is a simple ontology modeling scientific papers. In this
\<^isadof> application scenario, we deliberately refrain from integrating references to
(Isabelle) formal content in order demonstrate that \<^isadof> is not a framework from
Isabelle users to Isabelle users only.
Of course, such references can be added easily and represent a particular strength
of \<^isadof>.
Isabelle users to Isabelle users only. Of course, such references can be added easily and
represent a particular strength of \<^isadof>.\<close>

\begin{figure}
\begin{isar}
text*["paper_onto_core"::float,
main_caption="\<open>The core of the ontology definition for writing scholarly papers.\<close>"]
\<open>@{boxed_theory_text [display]\<open>
doc_class title =
short_title :: "string option" <= None
@@ -299,64 +331,62 @@ doc_class abstract =

doc_class text_section =
main_author :: "author option" <= None
todo_list :: "string list" <= "[]"
\end{isar}
\caption{The core of the ontology definition for writing scholarly papers.}
\label{fig:paper-onto-core}
\end{figure}
The first part of the ontology \inlineisar+scholarly_paper+ (see \autoref{fig:paper-onto-core})
todo_list :: "string list" <= "[]"
\<close>}\<close>

text\<open> The first part of the ontology \<^theory_text>\<open>scholarly_paper\<close>
(see @{float "paper_onto_core"})
contains the document class definitions
with the usual text-elements of a scientific paper. The attributes \inlineisar+short_title+,
\inlineisar+abbrev+ etc are introduced with their types as well as their default values.
Our model prescribes an optional \inlineisar+main_author+ and a todo-list attached to an arbitrary
with the usual text-elements of a scientific paper. The attributes \<^theory_text>\<open>short_title\<close>,
\<^theory_text>\<open>abbrev\<close> etc are introduced with their types as well as their default values.
Our model prescribes an optional \<^theory_text>\<open>main_author\<close> and a todo-list attached to an arbitrary
text section; since instances of this class are mutable (meta)-objects of text-elements, they
can be modified arbitrarily through subsequent text and of course globally during text evolution.
Since \inlineisar+author+ is a HOL-type internally generated by \<^isadof> framework and can therefore
appear in the \inlineisar+main_author+ attribute of the \inlineisar+text_section+ class;
Since \<^theory_text>\<open>author\<close> is a HOL-type internally generated by \<^isadof> framework and can therefore
appear in the \<^theory_text>\<open>main_author\<close> attribute of the \<^theory_text>\<open>text_section\<close> class;
semantic links between concepts can be modeled this way.
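Such a link is established simply by assigning an existing \<^theory_text>\<open>author\<close> instance to the attribute; this pattern appears verbatim in the present paper's own source, for the author instance \<^theory_text>\<open>bu\<close>:

```isabelle
section*[bgrnd::text_section,main_author="Some(@{author ''bu''}::author)"]
\<open> Background: The Isabelle System \<close>
```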

The translation of its content to, \<^eg>, Springer's \<^LaTeX> setup for the Lecture Notes in Computer
Science Series, as required by many scientific conferences, is mostly straight-forward. \<close>
Science Series, as required by many scientific conferences, is mostly straight-forward.
\<^vs>\<open>-0.8cm\<close>\<close>

figure*[fig1::figure,spawn_columns=False,relative_width="95",src="''figures/Dogfood-Intro''"]
figure*[fig1::figure,relative_width="95",file_src="''figures/Dogfood-Intro.png''"]
\<open> Ouroboros I: This paper from inside \<^dots> \<close>

text\<open> @{figure \<open>fig1\<close>} shows the corresponding view in the Isabelle/PIDE of thqqe present paper.
(*<*)declare_reference*[paper_onto_sections::float](*>*)
text\<open>\<^vs>\<open>-0.8cm\<close> @{figure \<open>fig1\<close>} shows the corresponding view in the Isabelle/PIDE of the present paper.
Note that the text uses \<^isadof>'s own text-commands containing the meta-information provided by
the underlying ontology.
We proceed by a definition of \inlineisar+introduction+'s, which we define as the extension of
\inlineisar+text_section+ which is intended to capture common infrastructure:
\begin{isar}
We proceed by a definition of \<^theory_text>\<open>introduction\<close>'s, which we define as the extension of
\<^theory_text>\<open>text_section\<close> which is intended to capture common infrastructure:
@{boxed_theory_text [display]\<open>
doc_class introduction = text_section +
comment :: string
\end{isar}
As a consequence of the definition as extension, the \inlineisar+introduction+ class
inherits the attributes \inlineisar+main_author+ and \inlineisar+todo_list+ together with
\<close>}
As a consequence of the definition as extension, the \<^theory_text>\<open>introduction\<close> class
inherits the attributes \<^theory_text>\<open>main_author\<close> and \<^theory_text>\<open>todo_list\<close> together with
the corresponding default values.

As a variant of the introduction, we could add here an attribute that contains the formal
claims of the article --- either here, or, for example, in the keyword list of the abstract.
As type, one could use either the built-in type \inlineisar+term+ (for syntactically correct,
but not necessarily proven entity) or \inlineisar+thm+ (for formally proven entities). It suffices
As type, one could use either the built-in type \<^theory_text>\<open>term\<close> (for syntactically correct,
but not necessarily proven entity) or \<^theory_text>\<open>thm\<close> (for formally proven entities). It suffices
to add the line:
\begin{isar}
@{boxed_theory_text [display]\<open>
claims :: "thm list"
\end{isar}
and to extent the \LaTeX-style accordingly to handle the additional field.
Note that \inlineisar+term+ and \inlineisar+thm+ are types reflecting the core-types of the
\<close>}
and to extent the \<^LaTeX>-style accordingly to handle the additional field.
Note that \<^theory_text>\<open>term\<close> and \<^theory_text>\<open>thm\<close> are types reflecting the core-types of the
Isabelle kernel. In a corresponding conclusion section, one could model analogously an
achievement section; by programming a specific compliance check in SML, the implementation
of automated forms of validation check for specific categories of papers is envisageable.
Since this requires deeper knowledge in Isabelle programming, however, we consider this out
of the scope of this paper.

We proceed more or less conventionally by the subsequent sections (\autoref{fig:paper-onto-sections})
\begin{figure}
\begin{isar}
doc_class technical = text_section +
definition_list :: "string list" <= "[]"

We proceed more or less conventionally by the subsequent sections (@{float (unchecked)\<open>paper_onto_sections\<close>})\<close>
text*["paper_onto_sections"::float,
main_caption = "''Various types of sections of a scholarly papers.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class example = text_section +
comment :: string
@@ -368,14 +398,13 @@ doc_class related_work = conclusion +

doc_class bibliography =
style :: "string option" <= "''LNCS''"
\end{isar}
\caption{Various types of sections of a scholarly papers.}
\label{fig:paper-onto-sections}
\end{figure}
and finish with a monitor class definition that enforces a textual ordering
in the document core by a regular expression (\autoref{fig:paper-onto-monitor}).
\begin{figure}
\begin{isar}
\<close>}\<close>
(*<*)declare_reference*[paper_onto_monitor::float](*>*)
text\<open>... and finish with a monitor class definition that enforces a textual ordering
in the document core by a regular expression (@{float (unchecked) "paper_onto_monitor"}).\<close>
text*["paper_onto_monitor"::float,
main_caption = "''A monitor for the scholarly paper ontology.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class article =
trace :: "(title + subtitle + author+ abstract +
introduction + technical + example +

@@ -383,23 +412,20 @@ doc_class article =
where "(title ~~ \<lbrakk>subtitle\<rbrakk> ~~ \<lbrace>author\<rbrace>$^+$+ ~~ abstract ~~
introduction ~~ \<lbrace>technical || example\<rbrace>$^+$ ~~ conclusion ~~
bibliography)"
\end{isar}
\caption{A monitor for the scholarly paper ontology.}
\label{fig:paper-onto-monitor}
\end{figure}
\<close>}
\<close>
text\<open> We might wish to add a component into our ontology that models figures to be included into
the document. This boils down to the exercise of modeling structured data in the style of a
functional programming language in HOL and to reuse the implicit HOL-type inside a suitable document
class \inlineisar+figure+:
\begin{isar}
class \<^theory_text>\<open>figure\<close>:
@{boxed_theory_text [display]\<open>
datatype placement = h | t | b | ht | hb
doc_class figure = text_section +
relative_width :: "int" (* percent of textwidth *)
src :: "string"
placement :: placement
spawn_columns :: bool <= True
\end{isar}
\<close>}
\<close>

text\<open> Alternatively, by including the HOL-libraries for rationals, it is possible to

@@ -407,11 +433,11 @@ use fractions or even mathematical reals. This must be counterbalanced by syntac
and semantic convenience. Choosing the mathematical reals, \<^eg>, would have the drawback that
attribute evaluation could be substantially more complicated.\<close>
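A sketch of such a variant, assuming the ontology theory imports HOL's rational numbers (\<^eg> via \<^theory_text>\<open>Complex_Main\<close>); the class name \<^theory_text>\<open>figure'\<close> and the use of type \<^theory_text>\<open>rat\<close> are illustrative assumptions, not part of the ontology above:

```isabelle
doc_class figure' = text_section +
  relative_width :: "rat"   (* fraction of the text width, e.g. 85/100 *)
  src :: "string"
```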

figure*[fig_figures::figure,spawn_columns=False,relative_width="85",src="''figures/Dogfood-figures''"]
figure*[fig_figures::figure,relative_width="85",file_src="''figures/Dogfood-figures.png''"]
\<open> Ouroboros II: figures \<^dots> \<close>

text\<open> The document class \inlineisar+figure+ --- supported by the \<^isadof> text command
\inlineisar+figure*+ --- makes it possible to express the pictures and diagrams in this paper
text\<open> The document class \<^theory_text>\<open>figure\<close> --- supported by the \<^isadof> text command
\<^theory_text>\<open>figure*\<close> --- makes it possible to express the pictures and diagrams in this paper
such as @{figure \<open>fig_figures\<close>}.
\<close>

@@ -434,10 +460,10 @@ We assume that the content has four different types of addressees, which have a
text\<open> The latter quality assurance mechanism is used in many universities,
where for organizational reasons the execution of an exam takes place in facilities
where the author of the exam is not expected to be physically present.
Furthermore, we assume a simple grade system (thus, some calculation is required).

\begin{figure}
\begin{isar}
Furthermore, we assume a simple grade system (thus, some calculation is required). \<close>
text*["onto_exam"::float,
main_caption = "''The core of the ontology modeling math exams.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class Author = ...
datatype Subject = algebra | geometry | statistical
datatype Grade = A1 | A2 | A3

@@ -459,18 +485,18 @@ doc_class Exam_item =
concerns :: "ContentClass set"

type_synonym SubQuestion = string
\end{isar}
\caption{The core of the ontology modeling math exams.}
\label{fig:onto-exam}
\end{figure}
The heart of this ontology (see \autoref{fig:onto-exam}) is an alternation of questions and answers,
\<close>}\<close>

(*<*)declare_reference*[onto_questions::float](*>*)
text\<open>The heart of this ontology (see @{float "onto_exam"}) is an alternation of questions and answers,
where the answers can consist of simple yes-no answers (QCM style check-boxes) or lists of formulas.
Since we do not
assume familiarity of the students with Isabelle (\inlineisar+term+ would assume that this is a
assume familiarity of the students with Isabelle (\<^theory_text>\<open>term\<close> would assume that this is a
parse-able and type-checkable entity), we basically model a derivation as a sequence of strings
(see \autoref{fig:onto-questions}).
\begin{figure}
\begin{isar}
(see @{float (unchecked)"onto_questions"}).\<close>
text*["onto_questions"::float,
main_caption = "''An exam can contain different types of questions.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class Answer_Formal_Step = Exam_item +
justification :: string
"term" :: "string"
@@ -494,19 +520,18 @@ doc_class Exercise = Exam_item +
content :: "(Task) list"
concerns :: "ContentClass set" <= "UNIV"
mark :: int
\end{isar}
\caption{An exam can contain different types of questions.}
\label{fig:onto-questions}
\end{figure}

\<close>}\<close>
(*<*)declare_reference*[onto_exam_monitor::float](*>*)
text\<open>
In many institutions, it makes sense to have a rigorous process of validation
for exam subjects: is the initial question correct? Is a proof in the sense of the
question possible? We model the possibility that the @{term examiner} validates a
question by a sample proof validated by Isabelle (see \autoref{fig:onto-exam-monitor}).
question by a sample proof validated by Isabelle (see @{float (unchecked) "onto_exam_monitor"}).
In our scenario this sample proofs are completely \<^emph>\<open>intern\<close>, \<^ie>, not exposed to the
students but just additional material for the internal review process of the exam.
\begin{figure}
\begin{isar}
students but just additional material for the internal review process of the exam.\<close>
text*["onto_exam_monitor"::float,
main_caption = "''Validating exams.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class Validation =
tests :: "term list" <="[]"
proofs :: "thm list" <="[]"

@@ -520,14 +545,9 @@ doc_class MathExam=
content :: "(Header + Author + Exercise) list"
global_grade :: Grade
where "\<lbrace>Author\<rbrace>$^+$ ~~ Header ~~ \<lbrace>Exercise ~~ Solution\<rbrace>$^+$ "
\end{isar}
\caption{Validating exams.}
\label{fig:onto-exam-monitor}
\end{figure}
\<close>

\<close>}\<close>

declare_reference*["fig_qcm"::figure]
(*<*)declare_reference*["fig_qcm"::figure](*>*)

text\<open> Using the \<^LaTeX> package hyperref, it is possible to conceive an interactive
exam-sheets with multiple-choice and/or free-response elements

@@ -535,14 +555,14 @@ exam-sheets with multiple-choice and/or free-response elements
help of the latter, it is possible that students write in a browser a formal mathematical
derivation---as part of an algebra exercise, for example---which is submitted to the examiners
electronically. \<close>
figure*[fig_qcm::figure,spawn_columns=False,
relative_width="90",src="''figures/InteractiveMathSheet''"]
\<open> A Generated QCM Fragment \<^dots> \<close>
figure*[fig_qcm::figure,
relative_width="90",file_src="''figures/InteractiveMathSheet.png''"]
\<open>A Generated QCM Fragment \<^dots> \<close>

subsection*[cenelec_onto::example]\<open> The Certification Scenario following CENELEC \<close>
text\<open> Documents to be provided in formal certifications (such as CENELEC
50126/50128, the DO-178B/C, or Common Criteria) can much profit from the control of ontological consistency:
a lot of an evaluators work consists in tracing down the links from requirements over
50126/50128, the DO-178B/C, or Common Criteria) can much profit from the control of ontological
consistency: a lot of an evaluators work consists in tracing down the links from requirements over
assumptions down to elements of evidence, be it in the models, the code, or the tests.
In a certification process, traceability becomes a major concern; and providing
mechanisms to ensure complete traceability already at the development of the
@@ -554,15 +574,17 @@ of developments targeting certifications. Continuously checking the links betwee
and the semi-formal parts of such documents is particularly valuable during the (usually
collaborative) development effort.

As in many other cases, formal certification documents come with their own terminology and
pragmatics of what has to be demonstrated and where, and how the traceability of requirements through
As in many other cases, formal certification documents come with their own terminology and pragmatics
of what has to be demonstrated and where, and how the traceability of requirements through
design-models over code to system environment assumptions has to be assured.
\<close>
(*<*)declare_reference*["conceptual"::float](*>*)
text\<open> In the sequel, we present a simplified version of an ontological model used in a
case-study~@{cite "bezzecchi.ea:making:2018"}. We start with an introduction of the concept of requirement
(see \autoref{fig:conceptual}).
\begin{figure}
\begin{isar}
(see @{float (unchecked) "conceptual"}). \<close>
text*["conceptual"::float,
main_caption = "''Modeling requirements.''"]\<open>
@{boxed_theory_text [display]\<open>
doc_class requirement = long_name :: "string option"

doc_class requirement_analysis = no :: "nat"

@@ -575,11 +597,9 @@ datatype ass_kind = informal | semiformal | formal

doc_class assumption = requirement +
assumption_kind :: ass_kind <= informal
\end{isar}
\caption{Modeling requirements.}
\label{fig:conceptual}
\end{figure}
|
||||
Such ontologies can be enriched by larger explanations and examples, which may help
\<close>}\<close>

text\<open>Such ontologies can be enriched by larger explanations and examples, which may help
the team of engineers substantially when developing the central document for a certification,
like an explication of what precisely the difference is between an \<^emph>\<open>hypothesis\<close> and an
\<^emph>\<open>assumption\<close> in the context of the evaluation standard. Since the PIDE makes for each
@@ -601,71 +621,70 @@ is the category \<^emph>\<open>safety related application condition\<close> (or
for short) which is used for \<^emph>\<open>ec\<close>'s that establish safety properties
of the evaluation target. Their trackability throughout the certification
is therefore particularly critical. This is naturally modeled as follows:
\begin{isar}
@{boxed_theory_text [display]\<open>
doc_class ec = assumption +
assumption_kind :: ass_kind <= (*default *) formal

doc_class srac = ec +
assumption_kind :: ass_kind <= (*default *) formal
\end{isar}
\<close>}
\<close>
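Instances of such classes can then be created with the \<^theory_text>\<open>text*\<close> command; the following sketch is purely illustrative (label, attribute value, and wording are hypothetical, not taken from the case-study), showing how a \<^theory_text>\<open>srac\<close> instance inherits the \<^theory_text>\<open>long_name\<close> attribute from \<^theory_text>\<open>requirement\<close>:

```isabelle
(* hypothetical instance of the srac class defined above *)
text*[srac1::srac, long_name = "Some ''timing bound''"]
\<open>The control loop must complete each cycle within the configured deadline.\<close>
```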

section*[ontopide::technical]\<open> Ontology-based IDE support \<close>
text\<open> We present a selection of interaction scenarios @{example \<open>scholar_onto\<close>}
and @{example \<open>cenelec_onto\<close>} with Isabelle/PIDE instrumented by \<^isadof>. \<close>

(*<*)
declare_reference*["text_elements"::float]
declare_reference*["hyperlinks"::float]
(*>*)

subsection*[scholar_pide::example]\<open> A Scholarly Paper \<close>
text\<open> In \autoref{fig-Dogfood-II-bgnd1} and \autoref{fig-bgnd-text_section} we show how
text\<open> In @{float (unchecked) "text_elements"}~(a)
and @{float (unchecked) "text_elements"}~(b) we show how
hovering over links permits exploring their meta-information.
Clicking on a document class identifier permits hyperlinking into the corresponding
class definition (\autoref{fig:Dogfood-IV-jumpInDocCLass}); hovering over an attribute-definition
(which is qualified in order to disambiguate; \autoref{fig:Dogfood-V-attribute}).
class definition (@{float (unchecked) "hyperlinks"}~(a)); hovering over an attribute-definition
(which is qualified in order to disambiguate; @{float (unchecked) "hyperlinks"}~(b)).
\<close>

side_by_side_figure*["text-elements"::side_by_side_figure,anchor="''fig-Dogfood-II-bgnd1''",
caption="''Exploring a Reference of a Text-Element.''",relative_width="48",
src="''figures/Dogfood-II-bgnd1''",anchor2="''fig-bgnd-text_section''",
caption2="''Exploring the class of a text element.''",relative_width2="47",
src2="''figures/Dogfood-III-bgnd-text_section''"]\<open> Exploring text elements. \<close>

side_by_side_figure*["hyperlinks"::side_by_side_figure,anchor="''fig:Dogfood-IV-jumpInDocCLass''",
caption="''Hyperlink to Class-Definition.''",relative_width="48",
src="''figures/Dogfood-IV-jumpInDocCLass''",anchor2="''fig:Dogfood-V-attribute''",
caption2="''Exploring an attribute.''",relative_width2="47",
src2="''figures/Dogfood-III-bgnd-text_section''"]\<open> Hyperlinks.\<close>
text*["text_elements"::float,
main_caption="\<open>Exploring text elements.\<close>"]
\<open>
@{fig_content (width=53, height=5, caption="Exploring a reference of a text element.") "figures/Dogfood-II-bgnd1.png"
}\<^hfill>@{fig_content (width=47, height=5, caption="Exploring the class of a text element.") "figures/Dogfood-III-bgnd-text_section.png"}
\<close>

text*["hyperlinks"::float,
main_caption="\<open>Hyperlinks.\<close>"]
\<open>
@{fig_content (width=48, caption="Hyperlink to Class-Definition.") "figures/Dogfood-IV-jumpInDocCLass.png"
}\<^hfill>@{fig_content (width=47, caption="Exploring an attribute.") "figures/Dogfood-V-attribute.png"}
\<close>

declare_reference*["figDogfoodVIlinkappl"::figure]
text\<open> An ontological reference application in \autoref{figDogfoodVIlinkappl}: the ontology-dependent
antiquotation \inlineisar|@ {example ...}| refers to the corresponding text-elements. Hovering allows
for inspection, clicking for jumping to the definition. If the link does not exist or has an
incompatible type, the text is not validated. \<close>

figure*[figDogfoodVIlinkappl::figure,relative_width="80",src="''figures/Dogfood-V-attribute''"]
\<open> Exploring an attribute (hyperlinked to the class). \<close>
subsection*[cenelec_pide::example]\<open> CENELEC \<close>
declare_reference*[figfig3::figure]
text\<open> The corresponding view in @{docitem (unchecked) \<open>figfig3\<close>} shows the core part of a document,
(*<*)declare_reference*[figfig3::figure](*>*)
text\<open> The corresponding view in @{figure (unchecked) \<open>figfig3\<close>} shows the core part of a document,
coherent to the @{example \<open>cenelec_onto\<close>}. The first sample shows standard Isabelle antiquotations
@{cite "wenzel:isabelle-isar:2017"} into formal entities of a theory. This way, the informal parts
of a document get ``formal content'' and become more robust under change.\<close>

figure*[figfig3::figure,relative_width="80",src="''figures/antiquotations-PIDE''"]
figure*[figfig3::figure,relative_width="80",file_src="''figures/antiquotations-PIDE.png''"]
\<open> Standard antiquotations referring to theory elements.\<close>

declare_reference*[figfig5::figure]
(*<*)declare_reference*[figfig5::figure] (*>*)
text\<open> The subsequent sample in @{figure (unchecked) \<open>figfig5\<close>} shows the definition of a
\<^emph>\<open>safety-related application condition\<close>, a side-condition of a theorem which
has the consequence that a certain calculation must be executed sufficiently fast on an embedded
device. This condition cannot be established inside the formal theory but has to be
checked by system integration tests.\<close>

figure*[figfig5::figure, relative_width="80", src="''figures/srac-definition''"]
figure*[figfig5::figure, relative_width="80", file_src="''figures/srac-definition.png''"]
\<open> Defining a SRAC reference \<^dots> \<close>
figure*[figfig7::figure, relative_width="80", src="''figures/srac-as-es-application''"]
figure*[figfig7::figure, relative_width="80", file_src="''figures/srac-as-es-application.png''"]
\<open> Using a SRAC as EC document reference. \<close>

text\<open> Now we reference in @{figure (unchecked) \<open>figfig7\<close>} this safety-related condition;
text\<open> Now we reference in @{figure \<open>figfig7\<close>} this safety-related condition;
however, this happens in a context where general \<^emph>\<open>exported constraints\<close> are listed.
\<^isadof>'s checks establish that this is legal in the given ontology.
@@ -677,7 +696,7 @@ informal parts. \<close>
section*[onto_future::technical]\<open> Monitor Classes \<close>
text\<open> Besides sub-typing, there is another relation between
document classes: a class can be a \<^emph>\<open>monitor\<close> to other ones,
which is expressed by the occurrence of a \inlineisar+where+ clause
which is expressed by the occurrence of a @{theory_text \<open>where\<close>} clause
in the document class definition containing a regular
expression (see @{example \<open>scholar_onto\<close>}).
While class-extension refers to data-inheritance of attributes,

@@ -686,8 +705,8 @@ in which instances of monitored classes may occur. \<close>

text\<open>
The control of monitors is done by the commands:
\<^item> \inlineisar+open_monitor* + <doc-class>
\<^item> \inlineisar+close_monitor* + <doc-class>
\<^item> \<^theory_text>\<open>open_monitor*\<close> \<^emph>\<open><doc-class>\<close>
\<^item> \<^theory_text>\<open>close_monitor*\<close> \<^emph>\<open><doc-class>\<close>
\<close>
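As an illustration of the \<^theory_text>\<open>where\<close> clause mentioned above, a monitor class could constrain a paper to one introduction followed by one or more technical sections; this is a sketch only, with a hypothetical class name and regular expression:

```isabelle
(* illustrative monitor class; the where-clause regular expression
   accepts: one introduction, then a non-empty run of technical sections *)
doc_class article =
   style_id :: string <= "''LNCS''"
   where "(introduction ~~ \<lbrace>technical\<rbrace>\<^sup>+)"
```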
text\<open>
where the automaton of the monitor class is expected to be in a final state. In the final state,

@@ -735,8 +754,7 @@ work in this area we are aware of is rOntorium~@{cite "rontorium"}, a plugin
for \<^Protege> that integrates R~@{cite "adler:r:2010"} into an
ontology environment. Here, the main motivation behind this
integration is to allow for statistically analyzing ontological
documents. Thus, this is complementary to our work.
\<close>
documents. Thus, this is complementary to our work.\<close>

text\<open> \<^isadof> in its present form has a number of technical shortcomings as well
as potentials not yet explored. On the long list of the shortcomings is the
@@ -1,15 +1,14 @@
session "2018-cicm-isabelle_dof-applications" = "Isabelle_DOF" +
  options [document = pdf, document_output = "output", document_build = dof,
    dof_ontologies = "Isabelle_DOF.scholarly_paper", dof_template = Isabelle_DOF.lncs,
    quick_and_dirty = true]
chapter AFP

session "Isabelle_DOF-Example-I" (AFP) = "Isabelle_DOF" +
  options [document = pdf, document_output = "output", document_build = dof, timeout = 300]
  theories
    IsaDofApplications
  document_files
    "root.bib"
    "authorarchive.sty"
    "preamble.tex"
    "lstisadof.sty"
    "vector_iD_icon.pdf"
    "lstisadof-manual.sty"
    "figures/isabelle-architecture.pdf"
    "figures/Dogfood-Intro.png"
    "figures/InteractiveMathSheet.png"
@@ -1,4 +1,4 @@
%% Copyright (C) 2008-2019 Achim D. Brucker, https://www.brucker.ch
%% Copyright (C) 2008-2023 Achim D. Brucker, https://www.brucker.ch
%%
%% License:
%% This program can be redistributed and/or modified under the terms

@@ -11,21 +11,22 @@
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause
\NeedsTeXFormat{LaTeX2e}\relax
\ProvidesPackage{authorarchive}
  [0000/00/00 Unreleased v1.1.1+%
  [2023/02/10 v1.3.0
   Self-archiving information for scientific publications.]
%
\PassOptionsToPackage{hyphens}{url}
%
\RequirePackage{ifthen}
\RequirePackage[inline]{enumitem}
\RequirePackage{graphicx}
\RequirePackage{orcidlink}
\RequirePackage{eso-pic}
\RequirePackage{intopdf}
\RequirePackage{kvoptions}
\RequirePackage{hyperref}
\RequirePackage{calc}
\RequirePackage{qrcode}
\RequirePackage{hvlogos}
\RequirePackage{etoolbox}
\newrobustcmd\BibTeX{Bib\TeX}
%
% Better URL breaking
\g@addto@macro{\UrlBreaks}{\UrlOrds}
@@ -80,31 +81,51 @@
}
\ProcessKeyvalOptions*

% Provide command for dynamic configuration setup
\def\authorsetup{\kvsetkeys{AA}}
\newcommand{\AA@defIncludeFiles}{
  \def\AA@bibBibTeX{\AA@bibtexdir/\AA@key.bib}
  \def\AA@bibBibTeXLong{\AA@bibtexdir/\AA@key.bibtex}
  \def\AA@bibWord{\AA@bibtexdir/\AA@key.word.xml}
  \def\AA@bibEndnote{\AA@bibtexdir/\AA@key.enw}
  \def\AA@bibRIS{\AA@bibtexdir/\AA@key.ris}
}
\AA@defIncludeFiles

\newboolean{AA@bibExists}
\setboolean{AA@bibExists}{false}
\newcommand{\AA@defIncludeSwitches}{
  \IfFileExists{\AA@bibBibTeX}{\setboolean{AA@bibExists}{true}}{}
  \IfFileExists{\AA@bibBibTeXLong}{\setboolean{AA@bibExists}{true}}{}
  \IfFileExists{\AA@bibWord}{\setboolean{AA@bibExists}{true}}{}
  \IfFileExists{\AA@bibEndnote}{\setboolean{AA@bibExists}{true}}{}
  \IfFileExists{\AA@bibRIS}{\setboolean{AA@bibExists}{true}}{}
}
\AA@defIncludeSwitches


% Provide command for dynamic configuration setup
% \def\authorsetup{\kvsetkeys{AA}}
\newcommand{\authorsetup}[1]{%
  \kvsetkeys{AA}{#1}
  \AA@defIncludeFiles
  \AA@defIncludeSwitches
}

% Load local configuration
\InputIfFileExists{authorarchive.config}{}{}

% define proxy command for setting PDF attributes
\ExplSyntaxOn
\@ifundefined{pdfmanagement_add:nnn}{%
  \newcommand{\AA@pdfpagesattribute}[2]{\pdfpagesattr{/#1 #2}}%
}{%
  \newcommand{\AA@pdfpagesattribute}[2]{\pdfmanagement_add:nnn{Pages}{#1}{#2}}%
}%
\ExplSyntaxOff

\newlength\AA@x
\newlength\AA@y
\newlength\AA@width

\def\AA@bibBibTeX{\AA@bibtexdir/\AA@key.bib}
\def\AA@bibBibTeXLong{\AA@bibtexdir/\AA@key.bibtex}
\def\AA@bibWord{\AA@bibtexdir/\AA@key.word.xml}
\def\AA@bibEndnote{\AA@bibtexdir/\AA@key.enw}
\def\AA@bibRIS{\AA@bibtexdir/\AA@key.ris}

\newboolean{AA@bibExists}
\setboolean{AA@bibExists}{false}
\IfFileExists{\AA@bibBibTeX}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibBibTeXLong}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibWord}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibEndnote}{\setboolean{AA@bibExists}{true}}{}
\IfFileExists{\AA@bibRIS}{\setboolean{AA@bibExists}{true}}{}

\setlength\AA@x{1in+\hoffset+\oddsidemargin}

\newcommand{\authorcrfont}{\footnotesize}
@@ -148,8 +169,7 @@
%%%% LNCS
\ifAA@LNCS%
  \ifAA@orcidicon%
    \renewcommand{\orcidID}[1]{\href{https://orcid.org/#1}{%
      \textsuperscript{\,\includegraphics[height=2\fontcharht\font`A]{vector_iD_icon}}}}
    \renewcommand{\orcidID}[1]{\orcidlink{#1}}
  \else\relax\fi%
  %
  \ifthenelse{\equal{\AA@publisher}{UNKNOWN PUBLISHER}}{%

@@ -157,23 +177,11 @@
  }{}
  \renewcommand{\authorcrfont}{\scriptsize}
  \@ifclasswith{llncs}{a4paper}{%
    \ExplSyntaxOn
    \@ifundefined{pdfmanagement_add:nnn}{%
      \pdfpagesattr{/CropBox [92 114 523 780]}%
    }{%
      \pdfmanagement_add:nnn {Pages}{CropBox}{[92~114~523~780]}
    }%
    \ExplSyntaxOff
    \AA@pdfpagesattribute{CropBox}{[92 114 523 780]}%
    \renewcommand{\authorat}[1]{\put(\LenToUnit{\AA@x},40){#1}}%
  }{%
    \ExplSyntaxOn
    \@ifundefined{pdfmanagement_add:nnn}{%
      \pdfpagesattr{/CropBox [92 65 523 731]}% LNCS page: 152x235 mm
    }{%
      \pdfmanagement_add:nnn {Pages}{CropBox}{[92~62~523~731]}
    }%
    \ExplSyntaxOff
    \renewcommand{\authorat}[1]{\put(\LenToUnit{\AA@x},23){#1}}
    \AA@pdfpagesattribute{CropBox}{[92 65 523 731]}%
    \renewcommand{\authorat}[1]{\put(\LenToUnit{\AA@x},23){#1}}%
  }
  \setlength{\AA@width}{\textwidth}
  \setcounter{tocdepth}{2}
@@ -186,7 +194,7 @@
  }{}
  \renewcommand{\authorat}[1]{\put(\LenToUnit{\AA@x},35){#1}}
  \renewcommand{\authorcrfont}{\scriptsize}
  \pdfpagesattr{/CropBox [70 65 526.378 748.15]} % TODO
  \AA@pdfpagesattribute{CropBox}{[70 65 526.378 748.15]}
  \setlength{\AA@width}{\textwidth}
  \setcounter{tocdepth}{2}
\fi
@@ -218,8 +226,6 @@
  draft = false,
  bookmarksopen = true,
  bookmarksnumbered= true,
  pdfauthor = {\@author},
  pdftitle = {\@title},
}

\@ifpackageloaded{totpages}{%

@@ -305,26 +311,26 @@
\hfill
\begin{itemize*}[label={}, itemjoin={,}]
\IfFileExists{\AA@bibBibTeX}{%
  \item \attachandlink{\AA@bibBibTeX}[application/x-bibtex]{BibTeX entry of this paper}{\BibTeX}%
  \item \expanded{\attachandlink[\AA@key.bib]{\AA@bibBibTeX}[application/x-bibtex]{BibTeX entry of this paper}{\BibTeX}}%
}{%
  \IfFileExists{\AA@bibBibTeXLong}{%
    \item \attachandlink[\AA@key.bib]{\AA@bibBibTeXLong}[application/x-bibtex]{BibTeX entry of this paper}{\BibTeX}%
    \item \expanded{\attachandlink[\AA@key.bib]{\AA@bibBibTeXLong}[application/x-bibtex]{BibTeX entry of this paper}{\BibTeX}}%
  }{%
    \typeout{No file \AA@bibBibTeX{} (and no \AA@bibBibTeXLong) found. Not embedded reference in BibTeX format.}%
  }%
}%
\IfFileExists{\AA@bibWord}{%
  \item \attachandlink{\AA@bibWord}[application/xml]{XML entry of this paper (e.g., for Word 2007 and later)}{Word}%
  \item \expanded{\attachandlink[\AA@key.word.xml]{\AA@bibWord}[application/xml]{XML entry of this paper (e.g., for Word 2007 and later)}{Word}}%
}{%
  \typeout{No file \AA@bibWord{} found. Not embedded reference for Word 2007 and later.}%
}%
\IfFileExists{\AA@bibEndnote}{%
  \item \attachandlink{\AA@bibEndnote}[application/x-endnote-refer]{Endnote entry of this paper}{EndNote}%
  \item \expanded{\attachandlink[\AA@key.enw]{\AA@bibEndnote}[application/x-endnote-refer]{Endnote entry of this paper}{EndNote}}%
}{%
  \typeout{No file \AA@bibEndnote{} found. Not embedded reference in Endnote format.}%
}%
\IfFileExists{\AA@bibRIS}{%
  \item \attachandlink{\AA@bibRIS}[application/x-research-info-systems]{RIS entry of this paper}{RIS}%
  \item \expanded{\attachandlink[\AA@key.ris]{\AA@bibRIS}[application/x-research-info-systems]{RIS entry of this paper}{RIS}}%
}{%
  \typeout{No file \AA@bibRIS{} found. Not embedded reference in RIS format.}%
}%
@@ -90,9 +90,7 @@
  ,enhanced jigsaw
  ,borderline west={2pt}{0pt}{isar!60!black}
  ,sharp corners
  ,before skip balanced=0.5\baselineskip plus 2pt
% ,before skip=10pt
% ,after skip=10pt
  %,before skip balanced=0.5\baselineskip plus 2pt % works only with TeX Live 2020 and later
  ,enlarge top by=0mm
  ,enhanced
  ,overlay={\node[draw,fill=isar!60!black,xshift=0pt,anchor=north
@@ -136,11 +134,12 @@
\lstloadlanguages{ML}
\providecolor{sml}{named}{red}
\lstdefinestyle{sml}{
  basicstyle=\ttfamily,%
  commentstyle=\itshape,%
  keywordstyle=\bfseries\color{CornflowerBlue},%
  ndkeywordstyle=\color{green},%
  language=ML
  ,escapechar=ë%
  ,basicstyle=\ttfamily%
  ,commentstyle=\itshape%
  ,keywordstyle=\bfseries\color{CornflowerBlue}%
  ,ndkeywordstyle=\color{green}%
  ,language=ML
% ,literate={%
%   {<@>}{@}1%
% }
@@ -150,7 +149,7 @@
  ,tagstyle=\color{CornflowerBlue}%
  ,markfirstintag=true%
}%
\def\inlinesml{\lstinline[style=sml,breaklines=true,mathescape,breakatwhitespace=true]}
\def\inlinesml{\lstinline[style=sml,breaklines=true,breakatwhitespace=true]}
\newtcblisting{sml}[1][]{%
  listing only%
  ,boxrule=0pt

@@ -170,7 +169,6 @@
    style=sml
    ,columns=flexible%
    ,basicstyle=\small\ttfamily
    ,mathescape
    ,#1
  }
}%
@@ -296,3 +294,34 @@
}%
%% </bash>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <config>
\providecolor{config}{named}{gray}
\newtcblisting{config}[2][]{%
  listing only%
  ,boxrule=0pt
  ,boxsep=0pt
  ,colback=white!90!config
  ,enhanced jigsaw
  ,borderline west={2pt}{0pt}{config!60!black}
  ,sharp corners
% ,before skip=10pt
% ,after skip=10pt
  ,enlarge top by=0mm
  ,enhanced
  ,overlay={\node[draw,fill=config!60!black,xshift=0pt,anchor=north
                  east,font=\bfseries\footnotesize\color{white}]
                  at (frame.north east) {#2};}
  ,listing options={
    breakatwhitespace=true
    ,columns=flexible%
    ,basicstyle=\small\ttfamily
    ,mathescape
    ,#1
  }
}%
%% </config>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
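A usage sketch of the `config` environment defined in this hunk (file name and contents below are hypothetical): the optional argument passes extra listing options, and the mandatory argument is the title rendered in the overlay node at the frame's north-east corner:

```latex
\begin{config}[breaklines=true]{example.config}
  key = value
\end{config}
```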
@@ -20,39 +20,9 @@
\usepackage{xcolor}
\usepackage{paralist}
\usepackage{listings}
\usepackage{lstisadof}
\usepackage{xspace}

\lstloadlanguages{bash}
\lstdefinestyle{bash}{language=bash,
  ,basicstyle=\ttfamily%
  ,showspaces=false%
  ,showlines=false%
  ,columns=flexible%
% ,keywordstyle=\bfseries%
  % Defining 2-keywords
  ,keywordstyle=[1]{\color{BrickRed!60}\bfseries}%
  % Defining 3-keywords
  ,keywordstyle=[2]{\color{OliveGreen!60}\bfseries}%
  % Defining 4-keywords
  ,keywordstyle=[3]{\color{black!60}\bfseries}%
  % Defining 5-keywords
  ,keywordstyle=[4]{\color{Blue!70}\bfseries}%
  % Defining 6-keywords
  ,keywordstyle=[5]{\itshape}%
  %
}
\lstdefinestyle{displaybash}{style=bash,
  basicstyle=\ttfamily\footnotesize,
  backgroundcolor=\color{black!2}, frame=lines}%

\lstnewenvironment{bash}[1][]{\lstset{style=displaybash, #1}}{}
\def\inlinebash{\lstinline[style=bash, breaklines=true,columns=fullflexible]}
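These definitions would be used, for instance, as follows (an illustrative sketch; the shell command shown is arbitrary):

```latex
\begin{bash}
isabelle version
\end{bash}
% or inline:
\inlinebash|isabelle version|
```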

\usepackage[caption]{subfig}
\usepackage[size=footnotesize]{caption}

\usepackage{lstisadof-manual}

\providecommand{\isactrlemph}[1]{\emph{#1}}
\usepackage[LNCS,
  orcidicon,
  key=brucker.ea-isabelle-ontologies-2018,
@@ -0,0 +1,9 @@
chapter AFP

session "Isabelle_DOF-Example-II" (AFP) = "Isabelle_DOF" +
  options [document = pdf, document_output = "output", document_build = dof, timeout = 300]
  theories
    "paper"
  document_files
    "root.bib"
    "preamble.tex"
@@ -1,6 +1,8 @@
%% This is a placeholder for user-specific configuration and packages.

\usepackage{stmaryrd}
\usepackage{pifont}% http://ctan.org/pkg/pifont


\title{<TITLE>}
\author{<AUTHOR>}
@@ -6870,7 +6870,7 @@ isbn="978-3-540-48509-4"
  title = {{Isabelle's} Logic: {HOL}},
  author = {Tobias Nipkow and Lawrence C. Paulson and Markus Wenzel},
  year = 2009,
  misc = {\url{http://isabelle.in.tum.de/library/HOL/}}
  misc = {\url{https://isabelle.in.tum.de/library/HOL/}}
}

@InProceedings{ garson.ea:security:2008,

@@ -11000,7 +11000,7 @@ isbn="978-1-4471-3182-3"
  journal = {Archive of Formal Proofs},
  month = apr,
  year = 2019,
  note = {\url{http://isa-afp.org/entries/HOL-CSP.html}},
  note = {\url{https://isa-afp.org/entries/HOL-CSP.html}},
  ISSN = {2150-914x},
}
@@ -3,6 +3,8 @@ theory "paper"
  imports "Isabelle_DOF.scholarly_paper"
begin

use_template "scrartcl"
use_ontology "scholarly_paper"

open_monitor*[this::article]


@@ -10,10 +12,13 @@ declare[[ strict_monitor_checking = false]]
declare[[ Definition_default_class = "definition"]]
declare[[ Lemma_default_class = "lemma"]]
declare[[ Theorem_default_class = "theorem"]]
declare[[ Corollary_default_class = "corollary"]]

define_shortcut* csp \<rightleftharpoons> \<open>CSP\<close>
                 holcsp \<rightleftharpoons> \<open>HOL-CSP\<close>
                 isabelle \<rightleftharpoons> \<open>Isabelle/HOL\<close>
                 hfill \<rightleftharpoons> \<open>\hfill\<close>
                 br \<rightleftharpoons> \<open>\break\<close>

(*>*)
@@ -25,7 +30,7 @@ author*[lina,email="\<open>lina.ye@lri.fr\<close>",affiliation="\<open>LRI, Inri

abstract*[abs, keywordlist="[\<open>Shallow Embedding\<close>,\<open>Process-Algebra\<close>,
                \<open>Concurrency\<close>,\<open>Computational Models\<close>]"]
\<open> The theory of Communicating Sequential Processes going back to Hoare and Roscoe is still today
\<open> The theory of Communicating Sequential Processes going back to Hoare and Roscoe is still today
one of the reference theories for concurrent specification and computing. In 1997, a first
formalization in \<^isabelle> of the denotational semantics of the Failure/Divergence Model of
\<^csp> was undertaken; in particular, this model can cope with infinite alphabets, in contrast

@@ -49,33 +54,32 @@ abstract*[abs, keywordlist="[\<open>Shallow Embedding\<close>,\<open>Process-Alg
If you consider citing this paper, please refer to @{cite "HOL-CSP-iFM2020"}.
\<close>
text\<open>\<close>
section*[introheader::introduction,main_author="Some(@{docitem ''bu''}::author)"]\<open> Introduction \<close>
text*[introtext::introduction]\<open>
Communicating Sequential Processes (\<^csp>) is a language
to specify and verify patterns of interaction of concurrent systems.
Together with CCS and LOTOS, it belongs to the family of \<^emph>\<open>process algebras\<close>.
\<^csp>'s rich theory comprises denotational, operational and algebraic semantic facets
and has influenced programming languages such as Limbo, Crystal, Clojure and
most notably Golang @{cite "donovan2015go"}. \<^csp> has been applied in
industry as a tool for specifying and verifying the concurrent aspects of hardware
systems, such as the T9000 transputer @{cite "Barret95"}.
section*[introheader::introduction,main_author="Some(@{author ''bu''}::author)"]\<open> Introduction \<close>
text*[introtext::introduction, level="Some 1"]\<open>
Communicating Sequential Processes (\<^csp>) is a language to specify and verify patterns of
interaction of concurrent systems. Together with CCS and LOTOS, it belongs to the family of
\<^emph>\<open>process algebras\<close>. \<^csp>'s rich theory comprises denotational, operational and algebraic semantic
facets and has influenced programming languages such as Limbo, Crystal, Clojure and most notably
Golang @{cite "donovan2015go"}. \<^csp> has been applied in industry as a tool for specifying and
verifying the concurrent aspects of hardware systems, such as the T9000 transputer
@{cite "Barret95"}.

The theory of \<^csp> was first described in 1978 in a book by Tony Hoare @{cite "Hoare:1985:CSP:3921"},
but has since evolved substantially @{cite "BrookesHR84" and "brookes-roscoe85" and "roscoe:csp:1998"}.
\<^csp> describes the most common communication and synchronization mechanisms
with one single language primitive: synchronous communication written \<open>_\<lbrakk>_\<rbrakk>_\<close>. \<^csp> semantics is
described by a fully abstract model of behaviour designed to be \<^emph>\<open>compositional\<close>: the denotational
semantics of a process \<open>P\<close> encompasses all possible behaviours of this process in the context of all
possible environments \<open>P \<lbrakk>S\<rbrakk> Env\<close> (where \<open>S\<close> is the set of \<open>atomic events\<close> both \<open>P\<close> and \<open>Env\<close> must
synchronize). This design objective has the consequence that two kinds of choice have to
be distinguished:
\<^enum> the \<^emph>\<open>external choice\<close>, written \<open>_\<box>_\<close>, which forces a process "to follow" whatever
  the environment offers, and
\<^enum> the \<^emph>\<open>internal choice\<close>, written \<open>_\<sqinter>_\<close>, which imposes on the environment of a process
  "to follow" the non-deterministic choices made.
\<^csp> describes the most common communication and synchronization mechanisms with one single language
primitive: synchronous communication written \<open>_\<lbrakk>_\<rbrakk>_\<close>. \<^csp> semantics is described by a fully abstract
model of behaviour designed to be \<^emph>\<open>compositional\<close>: the denotational semantics of a process \<open>P\<close>
encompasses all possible behaviours of this process in the context of all possible environments
\<open>P \<lbrakk>S\<rbrakk> Env\<close> (where \<open>S\<close> is the set of \<open>atomic events\<close> both \<open>P\<close> and \<open>Env\<close> must synchronize). This
design objective has the consequence that two kinds of choice have to be distinguished: \<^vs>\<open>0.1cm\<close>

\<^enum> the \<^emph>\<open>external choice\<close>, written \<open>_\<box>_\<close>, which forces a process "to follow" whatever
  the environment offers, and \<^vs>\<open>-0.4cm\<close>
\<^enum> the \<^emph>\<open>internal choice\<close>, written \<open>_\<sqinter>_\<close>, which imposes on the environment of a process
  "to follow" the non-deterministic choices made.\<^vs>\<open>0.3cm\<close>
\<close>
text\<open>

text\<open> \<^vs>\<open>-0.6cm\<close>
Generalizations of these two operators \<open>\<box>x\<in>A. P(x)\<close> and \<open>\<Sqinter>x\<in>A. P(x)\<close> allow for modeling the concepts
of \<^emph>\<open>input\<close> and \<^emph>\<open>output\<close>: Based on the prefix operator \<open>a\<rightarrow>P\<close> (event \<open>a\<close> happens, then the process
proceeds with \<open>P\<close>), receiving input is modeled by \<open>\<box>x\<in>A. x\<rightarrow>P(x)\<close> while sending output is represented
@ -121,25 +125,11 @@ attempt to formalize denotational \<^csp> semantics covering a part of Bill Rosc
|
|||
\<^url>\<open>https://gitlri.lri.fr/burkhart.wolff/hol-csp2.0\<close>. In this paper, all Isabelle proofs are
omitted.\<close>}.
\<close>

(*
% Moreover, decomposition rules of the form:
% \begin{center}
% \begin{minipage}[c]{10cm}
% @{cartouche [display] \<open>C \<Longrightarrow> A \<sqsubseteq>\<^sub>F\<^sub>D A' \<Longrightarrow> B \<sqsubseteq>\<^sub>F\<^sub>D B' \<Longrightarrow> A \<lbrakk>S\<rbrakk> B \<sqsubseteq>\<^sub>F\<^sub>D A' \<lbrakk>S\<rbrakk> B'\<close>}
% \end{minipage}
% \end{center}
% are of particular interest since they allow to avoid the costly automata-product construction
% of model-checkers and to separate infinite sub-systems from finite (model-checkable) ones; however,
% their side-conditions \<open>C\<close> are particularly tricky to work out. Decomposition rules may pave the
% way for future tool combinations for model-checkers such as FDR4~@{cite "fdr4"} or
% PAT~@{cite "SunLDP09"} based on proof certifications.*)

section*["pre"::technical,main_author="Some(@{author \<open>bu\<close>}::author)"]
\<open>Preliminaries\<close>

text\<open>\<close>

subsection*[cspsemantics::technical, main_author="Some(@{author ''bu''})"]\<open>Denotational \<^csp> Semantics\<close>

text\<open> The denotational semantics (following @{cite "roscoe:csp:1998"}) comes in three layers:
the \<^emph>\<open>trace model\<close>, the \<^emph>\<open>(stable) failures model\<close> and the \<^emph>\<open>failure/divergence model\<close>.

\<open>\<T>(Skip) = \<T>(Stop) = {[]}\<close>.
Note that the trace sets, representing all \<^emph>\<open>partial\<close> histories, are in general prefix-closed.\<close>

text*[ex1::math_example, status=semiformal, level="Some 1"] \<open>
Let two processes be defined as follows:\<^vs>\<open>0.2cm\<close>

\<^enum> \<open>P\<^sub>d\<^sub>e\<^sub>t = (a \<rightarrow> Stop) \<box> (b \<rightarrow> Stop)\<close>
\<^enum> \<open>P\<^sub>n\<^sub>d\<^sub>e\<^sub>t = (a \<rightarrow> Stop) \<sqinter> (b \<rightarrow> Stop)\<close>
\<close>
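The difference between the two example processes can be made concrete with a small executable sketch. The following Python model (an editorial illustration only, not part of \<^holcsp>; all helper names such as `prefix_failures`, `ext_choice` and `int_choice` are hypothetical) represents the failures of these finite processes as sets of (trace, refusal-set) pairs. It shows that \<open>P\<^sub>d\<^sub>e\<^sub>t\<close> and \<open>P\<^sub>n\<^sub>d\<^sub>e\<^sub>t\<close> have identical trace sets, yet only the internal choice may initially refuse an offered event:

```python
from itertools import combinations

ALPHABET = {'a', 'b'}

def powerset(s):
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def prefix_failures(event):
    """Failures of (event -> Stop): initially only refusal sets not
    containing `event` are possible; after the event, Stop refuses all."""
    fails = set()
    for X in powerset(ALPHABET):
        if event not in X:
            fails.add(((), X))       # initial state offers `event`
        fails.add(((event,), X))     # Stop refuses any set
    return fails

def ext_choice(f1, f2):
    """External choice: initially only sets refused by BOTH branches."""
    init = {((), X) for (t, X) in f1 if t == () and ((), X) in f2}
    rest = {(t, X) for (t, X) in f1 | f2 if t != ()}
    return init | rest

def int_choice(f1, f2):
    """Internal choice: union of the branches' failures."""
    return f1 | f2

P_det  = ext_choice(prefix_failures('a'), prefix_failures('b'))
P_ndet = int_choice(prefix_failures('a'), prefix_failures('b'))

def traces(F):
    return {t for (t, X) in F}

assert traces(P_det) == traces(P_ndet)       # indistinguishable by traces
assert ((), frozenset({'b'})) in P_ndet      # internal choice may refuse b
assert ((), frozenset({'b'})) not in P_det   # external choice must accept b
```

The sketch makes the point of the failures model tangible: the trace model cannot separate the two processes, while the refusal component of failures can.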


many times. However, using the \<^csp> hiding operator \<open>_\_\<close>, this activity is concealed:

\<^enum> \<open>P\<^sub>i\<^sub>n\<^sub>f = (\<mu> X. a \<rightarrow> X) \ {a}\<close>
\<close>

text\<open>where \<open>P\<^sub>i\<^sub>n\<^sub>f\<close> will be equivalent to \<open>\<bottom>\<close> in the process cpo ordering.

\<close>

subsection*["isabelleHol"::technical, main_author="Some(@{author ''bu''})"]\<open>Isabelle/HOL\<close>
text\<open> Nowadays, Isabelle/HOL is one of the major interactive theory development environments
@{cite "nipkow.ea:isabelle:2002"}. HOL stands for Higher-Order Logic, a logic based on simply-typed
\<open>\<lambda>\<close>-calculus extended by parametric polymorphism and Haskell-like type-classes.

For the work presented here, one relevant construction is:

\<^item> \<^theory_text>\<open>typedef (\<alpha>\<^sub>1,...,\<alpha>\<^sub>n)t = E\<close>

It creates a fresh type that is isomorphic to a set \<open>E\<close> involving the types \<open>\<alpha>\<^sub>1,...,\<alpha>\<^sub>n\<close>.
Isabelle/HOL performs a number of syntactic checks for these constructions that guarantee the logical

For this work, a particular library called \<^theory_text>\<open>HOLCF\<close> is intensively used. It provides classical
domain theory for a particular type-class \<open>\<alpha>::pcpo\<close>, \<^ie> the class of types \<open>\<alpha>\<close> for which

\<^enum> a least element \<open>\<bottom>\<close> is defined, and
\<^enum> a complete partial order \<open>_\<sqsubseteq>_\<close> is defined.

For these types, \<^theory_text>\<open>HOLCF\<close> provides a fixed-point operator \<open>\<mu>X. f X\<close> as well as the
fixed-point induction and other (automated) proof infrastructure. Isabelle's type-inference can
automatically infer, for example, that if \<open>\<alpha>::pcpo\<close>, then \<open>(\<beta> \<Rightarrow> \<alpha>)::pcpo\<close>. \<close>

section*["csphol"::technical,main_author="Some(@{author ''bu''}::author)", level="Some 2"]
\<open>Formalising Denotational \<^csp> Semantics in HOL\<close>

text\<open>\<close>

subsection*["processinv"::technical, main_author="Some(@{author ''bu''})"]
\<open>Process Invariant and Process Type\<close>
text\<open> First, we need a slight revision of the concept
of \<^emph>\<open>trace\<close>: if \<open>\<Sigma>\<close> is the type of the atomic events (represented by a type variable), then
we need to extend this type by a special event \<open>\<checkmark>\<close> (called "tick") signaling termination.
Thus, traces have the type \<open>(\<Sigma>\<uplus>\<checkmark>)\<^sup>*\<close>, written \<open>\<Sigma>\<^sup>\<checkmark>\<^sup>*\<close>; since \<open>\<checkmark>\<close> may only occur at the end of a trace,
we need to define a predicate \<open>front\<^sub>-tickFree t\<close> that requires from traces that \<open>\<checkmark>\<close> can only occur
at the end.

Second, in the traditional literature, the semantic domain is implicitly described by 9 "axioms"

Informally, these are:

\<^item> the tick accepted after a trace \<open>s\<close> implies that all other events are refused;
\<^item> once a process has diverged, it can engage in or refuse any sequence of events;
\<^item> a trace ending with \<open>\<checkmark>\<close> belonging to the divergence set implies that its
maximum prefix without \<open>\<checkmark>\<close> is also a divergent trace.

More formally, a process \<open>P\<close> of the type \<open>\<Sigma> process\<close> should have the following properties:

@{cartouche [display, indent=10] \<open>([],{}) \<in> \<F> P \<and>
(\<forall> s X. (s,X) \<in> \<F> P \<longrightarrow> front_tickFree s) \<and>
(\<forall> s t . (s@t,{}) \<in> \<F> P \<longrightarrow> (s,{}) \<in> \<F> P) \<and>
(\<forall> s X Y. (s,Y) \<in> \<F> P \<and> X\<subseteq>Y \<longrightarrow> (s,X) \<in> \<F> P) \<and>
(\<forall> s X Y. (s,X) \<in> \<F> P \<and> (\<forall>c \<in> Y. ((s@[c],{}) \<notin> \<F> P)) \<longrightarrow> (s,X \<union> Y) \<in> \<F> P) \<and>
(\<forall> s X. (s@[\<checkmark>],{}) \<in> \<F> P \<longrightarrow> (s,X-{\<checkmark>}) \<in> \<F> P) \<and>
(\<forall> s t. s \<in> \<D> P \<and> tickFree s \<and> front_tickFree t \<longrightarrow> s@t \<in> \<D> P) \<and>
(\<forall> s X. s \<in> \<D> P \<longrightarrow> (s,X) \<in> \<F> P) \<and>
(\<forall> s. s@[\<checkmark>] \<in> \<D> P \<longrightarrow> s \<in> \<D> P)\<close>}

Our objective is to encapsulate this wishlist into a type constructed as a conservative
theory extension in our theory \<^holcsp>.
Therefore, third, we define a pre-type for processes \<open>\<Sigma> process\<^sub>0\<close> by \<open>\<P>(\<Sigma>\<^sup>\<checkmark>\<^sup>* \<times> \<P>(\<Sigma>\<^sup>\<checkmark>)) \<times> \<P>(\<Sigma>\<^sup>\<checkmark>)\<close>.
Fourth, we turn our wishlist of "axioms" above into the definition of a predicate \<open>is_process P\<close>
of type \<open>\<Sigma> process\<^sub>0 \<Rightarrow> bool\<close> deciding if its conditions are fulfilled. Since \<open>P\<close> is a pre-process,
we replace \<open>\<F>\<close> by \<open>fst\<close> and \<open>\<D>\<close> by \<open>snd\<close> (the HOL projections of a pair).
And last but not least, fifth, we use the following type definition:

\<^item> \<^theory_text>\<open>typedef '\<alpha> process = "{P :: '\<alpha> process\<^sub>0 . is_process P}"\<close>

Isabelle requires a proof for the existence of a witness for this set,
but this can be constructed in a straightforward manner. Suitable definitions for
\<open>\<T>\<close>, \<open>\<F>\<close> and \<open>\<D>\<close>, lifting \<open>fst\<close> and \<open>snd\<close> to the new \<open>'\<alpha> process\<close>-type, allow us to derive
the above properties for any \<open>P::'\<alpha> process\<close>. \<close>
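For readers who want a concrete handle on the \<open>is_process\<close> invariant, a fragment of it can be replayed on finite data. The following Python sketch (an editorial illustration, not the \<^holcsp> definition; it checks only four of the nine conditions and simplifies the divergence clause) validates a (failures, divergences) pair:

```python
from itertools import combinations

def subsets(X):
    X = list(X)
    return [frozenset(c) for r in range(len(X) + 1) for c in combinations(X, r)]

def is_process(F, D):
    """Fragment of the `is_process` well-formedness predicate on a
    finite (failures, divergences) pair; illustrative sketch only."""
    has_empty = ((), frozenset()) in F                    # ([],{}) ∈ F P
    prefix_ok = all((s[:i], frozenset()) in F             # traces prefix-closed
                    for (s, _) in F for i in range(len(s)))
    subset_ok = all((s, Y) in F                           # refusals subset-closed
                    for (s, X) in F for Y in subsets(X))
    div_in_fail = all((s, frozenset()) in F for s in D)   # divergences are traces
    return has_empty and prefix_ok and subset_ok and div_in_fail

# Stop over the alphabet {a}: refuses everything, never moves, never diverges.
STOP_F = {((), frozenset()), ((), frozenset({'a'}))}
assert is_process(STOP_F, set())

# Dropping the empty refusal set breaks the invariant (no ([],{}) entry):
assert not is_process({((), frozenset({'a'}))}, set())
```

The second assertion mirrors why the `Abs_process`/`Rep_process` construction needs a non-emptiness witness: not every element of the pre-type \<open>\<Sigma> process\<^sub>0\<close> is a process.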

subsection*["operator"::technical, main_author="Some(@{author ''lina''})"]
\<open>\<^csp> Operators over the Process Type\<close>
text\<open> Now, the operators of \<^csp> \<open>Skip\<close>, \<open>Stop\<close>, \<open>_\<sqinter>_\<close>, \<open>_\<box>_\<close>, \<open>_\<rightarrow>_\<close>,\<open>_\<lbrakk>_\<rbrakk>_\<close> etc.
for internal choice, external choice, prefix and parallel composition, can

\<^item> \<^theory_text>\<open>definition "P \<sqinter> Q \<equiv> Abs_process(\<F> P \<union> \<F> Q , \<D> P \<union> \<D> Q)"\<close>

where \<open>Rep_process\<close> and \<open>Abs_process\<close> are the representation and abstraction morphisms resulting
from the type definition linking the type \<open>'\<alpha> process\<close> isomorphically to the set \<open>'\<alpha> process\<^sub>0\<close>.
The projection into \<^emph>\<open>failures\<close> is defined by \<open>\<F> = fst \<circ> Rep_process\<close>, whereas the
\<^emph>\<open>divergences\<close> are defined by \<open>\<D> = snd \<circ> Rep_process\<close>. Proving the above properties for
\<open>\<F> (P \<sqinter> Q)\<close> and \<open>\<D> (P \<sqinter> Q)\<close> requires a proof that \<open>(\<F> P \<union> \<F> Q , \<D> P \<union> \<D> Q)\<close>
satisfies the well-formedness conditions of \<open>is_process\<close>, which is fairly simple in this case.

The definitional presentation of the \<^csp> process operators according to @{cite "roscoe:csp:1998"}
always follows this scheme. This part of the theory comprises around 2000 loc.
\<close>

subsection*["orderings"::technical, main_author="Some(@{author ''bu''})"]
\<open>Refinement Orderings\<close>

text\<open> \<^csp> is centered around the idea of process refinement; many critical properties,

model-checking techniques based on graph-exploration. Essentially, a process \<open>P\<close> \<^emph>\<open>refines\<close>
another process \<open>Q\<close> if and only if it is more deterministic and more defined (has fewer divergences).
Consequently, each of the three semantic models (trace, failures and failure/divergence)
has its corresponding refinement ordering.\<close>
Theorem*[th1::"theorem", short_name="\<open>Refinement properties\<close>"]\<open>
In this paper, we are interested in the following refinement orderings for the
failure/divergence model:

\<^enum> \<open>P \<sqsubseteq>\<^sub>\<F>\<^sub>\<D> Q \<equiv> \<F> P \<supseteq> \<F> Q \<and> \<D> P \<supseteq> \<D> Q\<close>
\<^enum> \<open>P \<sqsubseteq>\<^sub>\<T>\<^sub>\<D> Q \<equiv> \<T> P \<supseteq> \<T> Q \<and> \<D> P \<supseteq> \<D> Q\<close>
\<^enum> \<open>P \<sqsubseteq>\<^sub>\<FF> Q \<equiv> \<FF> P \<supseteq> \<FF> Q, \<FF>\<in>{\<T>,\<F>,\<D>}\<close> \<close>
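Since all these orderings are plain set containments over the semantic projections, they are directly executable on finite approximations. A minimal Python sketch (editorial illustration only; the toy failure sets below are hypothetical data restricted to the empty trace) checks \<open>_\<sqsubseteq>\<^sub>\<F>\<^sub>\<D>_\<close>:

```python
def refines_FD(P, Q):
    """P ⊑FD Q  iff  F(P) ⊇ F(Q) and D(P) ⊇ D(Q).
    A 'process' here is just a finite (failures, divergences) pair."""
    (F_p, D_p), (F_q, D_q) = P, Q
    return F_q <= F_p and D_q <= D_p

# Toy failure sets over {a,b}, restricted to the empty trace for brevity:
# the internal choice may initially refuse a or b, the external choice may not.
F_ndet = {((), frozenset()), ((), frozenset({'a'})), ((), frozenset({'b'}))}
F_det  = {((), frozenset())}

# The deterministic choice refines the non-deterministic one, not vice versa.
assert refines_FD((F_ndet, set()), (F_det, set()))
assert not refines_FD((F_det, set()), (F_ndet, set()))
```

The direction of the containment captures the intuition stated above: the refining process is the one with fewer failures and fewer divergences.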

text\<open> Notice that in the \<^csp> literature, only \<open>\<sqsubseteq>\<^sub>\<F>\<^sub>\<D>\<close> is well studied for the failure/divergence model.
Our formal analysis of different granularities of the refinement orderings
allows for a deeper understanding of the same semantic model. For example, \<open>\<sqsubseteq>\<^sub>\<T>\<^sub>\<D>\<close> turns
out to have better monotonicity properties in some cases and therefore allows for stronger proof

\<close>


subsection*["fixpoint"::technical, main_author="Some(@{author ''lina''})"]
\<open>Process Ordering and HOLCF\<close>
text\<open> For any denotational semantics, the fixed-point theory giving semantics to systems
of recursive equations is considered the keystone. Its prerequisite is a complete partial ordering

that completeness could at least be assured for read-operations. This more complex ordering
is based on the concept of \<^emph>\<open>refusals after\<close> a trace \<open>s\<close>, defined by \<open>\<R> P s \<equiv> {X | (s, X) \<in> \<F> P}\<close>.\<close>

Definition*[process_ordering, level= "Some 2", short_name="''process ordering''"]\<open>
We define \<open>P \<sqsubseteq> Q \<equiv> \<psi>\<^sub>\<D> \<and> \<psi>\<^sub>\<R> \<and> \<psi>\<^sub>\<M>\<close>, where
\<^enum> \<open>\<psi>\<^sub>\<D> = \<D> P \<supseteq> \<D> Q \<close>
\<^enum> \<open>\<psi>\<^sub>\<R> = s \<notin> \<D> P \<Rightarrow> \<R> P s = \<R> Q s\<close>
\<^enum> \<open>\<psi>\<^sub>\<M> = Mins(\<D> P) \<subseteq> \<T> Q \<close> \<close>

text\<open>The third condition \<open>\<psi>\<^sub>\<M>\<close> implies that the set of minimal divergent traces
(ones with no proper prefix that is also a divergence) in \<open>P\<close>, denoted by \<open>Mins(\<D> P)\<close>,
should be a subset of the trace set of \<open>Q\<close>.
%One may note that each element in \<open>Mins(\<D> P)\<close> does actually not contain the \<open>\<checkmark>\<close>,
%which can be deduced from the process invariants described
%in the precedent @{technical "processinv"}. This can be explained by the fact that we are not
%really concerned with what a process does after it terminates.


These rules allow us to infer automatically, for any process term, whether it is continuous or not.
The port of HOL-CSP 2 to HOLCF implied that the derivation of the entire continuity rules
had to be completely re-done (3000 loc).\<close>

Theorem*[th2,short_name="\<open>Fixpoint Induction\<close>"]
\<open>HOL-CSP provides an important proof principle, the fixed-point induction:
@{cartouche [display, indent=5] \<open>cont f \<Longrightarrow> adm P \<Longrightarrow> P \<bottom> \<Longrightarrow> (\<And>X. P X \<Longrightarrow> P(f X)) \<Longrightarrow> P(\<mu>X. f X)\<close>}
\<close>

text\<open>Fixed-point induction of @{theorem th2} requires a small side-calculus for establishing the admissibility
of a predicate; basically, predicates are admissible if they are valid for any least upper bound
of a chain \<open>x\<^sub>1 \<sqsubseteq> x\<^sub>2 \<sqsubseteq> x\<^sub>3 ... \<close> provided that \<open>\<forall>i. P(x\<^sub>i)\<close>. It turns out that \<open>_\<sqsubseteq>_\<close> and \<open>_\<sqsubseteq>\<^sub>F\<^sub>D_\<close> as
well as all other refinement orderings that we introduce in this paper are admissible.
Fixed-point inductions are the main proof weapon in verifications, together with monotonicities
and the \<^csp> laws. Denotational arguments can be hidden as they are not needed in practical
verifications. \<close>

subsection*["law"::technical, main_author="Some(@{author ''lina''})"]
\<open>\<^csp> Rules: Improved Proofs and New Results\<close>

text\<open>The \<^csp> operators enjoy a number of algebraic properties: commutativity,
associativities, and idempotence in some cases. Moreover, there is a rich body of distribution
laws between these operators. Our new version HOL-CSP 2 not only shortens and restructures the
proofs of @{cite "tej.ea:corrected:1997"}; the code reduces to 8000 loc from 25000 loc. \<close>

Theorem*[th3, short_name="\<open>Examples of Derived Rules.\<close>"]\<open>
\<^item> \<open>\<box>x\<in>A\<union>B\<rightarrow>P(x) = (\<box>x\<in>A\<rightarrow>P x) \<box> (\<box>x\<in>B\<rightarrow>P x)\<close>
\<^item> \<open>A\<union>B\<subseteq>C \<Longrightarrow> (\<box>x\<in>A\<rightarrow>P x \<lbrakk>C\<rbrakk> \<box>x\<in>B\<rightarrow>Q x) = \<box>x\<in>A\<inter>B\<rightarrow>(P x \<lbrakk>C\<rbrakk> Q x)\<close>
\<^item> @{cartouche [display]\<open>A\<subseteq>C \<Longrightarrow> B\<inter>C={} \<Longrightarrow>
(\<box>x\<in>A\<rightarrow>P x \<lbrakk>C\<rbrakk> \<box>x\<in>B\<rightarrow>Q x) = \<box>x\<in>B\<rightarrow>(\<box>x\<in>A\<rightarrow>P x \<lbrakk>C\<rbrakk> Q x)\<close>}
\<^item> \<open>finite A \<Longrightarrow> A\<inter>C = {} \<Longrightarrow> ((P \<lbrakk>C\<rbrakk> Q) \ A) = ((P \ A) \<lbrakk>C\<rbrakk> (Q \ A)) ...\<close>\<close>

text\<open>The continuity proof of the hiding operator is notorious. The proof is known to involve the
classical König's lemma stating that every infinite tree with finite branching has an infinite path.
We adapt this lemma to our context as follows:

@{cartouche [display, indent=5]
\<open>infinite tr \<Longrightarrow> \<forall>i. finite{t. \<exists>t'\<in>tr. t = take i t'}
\<Longrightarrow> \<exists> f. strict_mono f \<and> range f \<subseteq> {t. \<exists>t'\<in>tr. t \<le> t'}\<close>}

in order to come up with the continuity rule: \<open>finite S \<Longrightarrow> cont P \<Longrightarrow> cont(\<lambda>X. P X \ S)\<close>.
The original proof has been drastically shortened by a factor of 10, and important intermediate steps

cases to be considered as well as their complexity makes pen and paper proofs
practically infeasible.
\<close>

section*["newResults"::technical,main_author="Some(@{author ''safouan''}::author)",
main_author="Some(@{author ''lina''}::author)", level= "Some 3"]
\<open>Theoretical Results on Refinement\<close>
text\<open>\<close>
subsection*["adm"::technical,main_author="Some(@{author ''safouan''}::author)",
main_author="Some(@{author ''lina''}::author)"]
\<open>Decomposition Rules\<close>
text\<open>
In our framework, we implemented the pcpo process refinement together with the five refinement

\<^item> The sequence operator is not monotonic under \<open>\<sqsubseteq>\<^sub>\<F>\<close>, \<open>\<sqsubseteq>\<^sub>\<D>\<close> or \<open>\<sqsubseteq>\<^sub>\<T>\<close>:
@{cartouche [display,indent=5]
\<open>P \<sqsubseteq>\<^sub>\<FF> P'\<Longrightarrow> Q \<sqsubseteq>\<^sub>\<FF> Q' \<Longrightarrow> (P ; Q) \<sqsubseteq>\<^sub>\<FF> (P' ; Q') where \<FF>\<in>{\<T>\<D>,\<F>\<D>}\<close>}
All refinements are right-side monotonic, but \<open>\<sqsubseteq>\<^sub>\<F>\<close>, \<open>\<sqsubseteq>\<^sub>\<D>\<close> and \<open>\<sqsubseteq>\<^sub>\<T>\<close> are not left-side monotonic,
which can be explained by the interdependence relationship of failure and divergence projections
for the first component. We thus proved:
\<^item> The hiding operator is not monotonic under \<open>\<sqsubseteq>\<^sub>\<D>\<close>:
@{cartouche [display,indent=5] \<open>P \<sqsubseteq>\<^sub>\<FF> Q \<Longrightarrow> P \ A \<sqsubseteq>\<^sub>\<FF> Q \ A where \<FF>\<in>{\<T>,\<F>,\<T>\<D>,\<F>\<D>}\<close>}
Intuitively, for the divergence refinement of the hiding operator, there may be
some trace \<open>s\<in>\<T> Q\<close> with \<open>s\<notin>\<T> P\<close> such that it becomes divergent in \<open>Q \ A\<close> but
not in \<open>P \ A\<close>.
\<^item> Parallel composition is not monotonic under \<open>\<sqsubseteq>\<^sub>\<F>\<close>, \<open>\<sqsubseteq>\<^sub>\<D>\<close> or \<open>\<sqsubseteq>\<^sub>\<T>\<close>:
@{cartouche [display,indent=5] \<open>P \<sqsubseteq>\<^sub>\<FF> P' \<Longrightarrow> Q \<sqsubseteq>\<^sub>\<FF> Q' \<Longrightarrow> (P \<lbrakk>A\<rbrakk> Q) \<sqsubseteq>\<^sub>\<FF> (P' \<lbrakk>A\<rbrakk> Q') where \<FF>\<in>{\<T>\<D>,\<F>\<D>}\<close>}
The failure and divergence projections of this operator are also interdependent, similar to the
sequence operator. Hence, this operator is not monotonic with \<open>\<sqsubseteq>\<^sub>\<F>\<close>, \<open>\<sqsubseteq>\<^sub>\<D>\<close> and \<open>\<sqsubseteq>\<^sub>\<T>\<close>, but monotonic
when their combinations are considered. \<close>

(* Besides the monotonicity results on the above \<^csp> operators,
we have also proved that for other \<^csp> operators, such as multi-prefix and non-deterministic choice,
they are all monotonic with these five refinement orderings. Such theoretical results provide
significant indicators for semantics choices when considering specification decomposition.
We want to emphasize that this is the first work on such substantial
analysis in a formal way, as far as we know.

%In the literature, these processes are defined in a way that does not distinguish the special event \<open>tick\<close>. To be consistent with the idea that ticks should be distinguished on the semantic level, besides the above
three processes,

one can directly prove 3 since for both \<open>CHAOS\<close> and \<open>DF\<close>,
the version with \<open>SKIP\<close> is constructed exactly in the same way from that without \<open>SKIP\<close>.
And 4 is obtained based on the projection laws of internal choice \<open>\<sqinter>\<close>.
Finally, for 5, the difference between \<open>DF\<close> and \<open>RUN\<close> is that the former applies internal choice
while the latter applies external choice. From the projection laws of both operators,
the failure set of \<open>RUN\<close> has more constraints, thus being a subset of that of \<open>DF\<close>,
when the divergence set is empty, which is true for both processes.
*)

subsection*["processes"::technical,main_author="Some(@{author ''safouan''}::author)",
main_author="Some(@{author ''lina''}::author)"]
\<open>Reference Processes and their Properties\<close>
text\<open>
We now present reference processes that exhibit basic behaviors, introduced in

\<close>

(*<*) (* a test ...*)
text*[X22 ::math_content, level="Some 2" ]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
text*[X32::"definition", level="Some 2", mcc=defn]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
Definition*[X42, level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
Definition*[X52::"definition", level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>

text\<open> The \<open>RUN\<close>-process defined in @{math_content X22} represents the process that accepts all
events, but never stops nor deadlocks. The \<open>CHAOS\<close>-process comes in two variants shown in

stops or accepts any offered event, whereas \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P\<close> can additionally terminate.\<close>
(*>*)
|
||||
|
||||
Definition*[X2]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
|
||||
Definition*[X3]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
|
||||
Definition*[X4]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close>\<close>
|
||||
Definition*[X5]\<open>\<open>DF A \<equiv> \<mu> X. (\<sqinter> x \<in> A \<rightarrow> X)\<close> \<close>
|
||||
Definition*[X6]\<open>\<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. ((\<sqinter> x \<in> A \<rightarrow> X) \<sqinter> SKIP)\<close> \<close>
|
||||
Definition*[X2, level="Some 2"]\<open>\<open>RUN A \<equiv> \<mu> X. \<box> x \<in> A \<rightarrow> X\<close> \<close>
|
||||
Definition*[X3, level="Some 2"]\<open>\<open>CHAOS A \<equiv> \<mu> X. (STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close> \<close>
|
||||
Definition*[X4, level="Some 2"]\<open>\<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. (SKIP \<sqinter> STOP \<sqinter> (\<box> x \<in> A \<rightarrow> X))\<close>\<close>
|
||||
Definition*[X5, level="Some 2"]\<open>\<open>DF A \<equiv> \<mu> X. (\<sqinter> x \<in> A \<rightarrow> X)\<close> \<close>
|
||||
Definition*[X6, level="Some 2"]\<open>\<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<equiv> \<mu> X. ((\<sqinter> x \<in> A \<rightarrow> X) \<sqinter> SKIP)\<close> \<close>
|
||||
|
||||
text\<open>In the following, we denote \<open> \<R>\<P> = {DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P, DF, RUN, CHAOS, CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P}\<close>.
|
||||
All five reference processes are divergence-free.
|
||||
%which was done by using a particular lemma \<open>\<D> (\<mu> x. f x) = \<Inter>\<^sub>i\<^sub>\<in>\<^sub>\<nat> \<D> (f\<^sup>i \<bottom>)\<close>.
|
||||
which was proven by using a particular lemma \<open>\<D> (\<mu> x. f x) = \<Inter>\<^sub>i\<^sub>\<in>\<^sub>\<nat> \<D> (f\<^sup>i \<bottom>)\<close>.
|
||||
@{cartouche
|
||||
[display,indent=8] \<open> D (\<PP> UNIV) = {} where \<PP> \<in> \<R>\<P> and UNIV is the set of all events\<close>
|
||||
}
|
||||
Regarding the failure refinement ordering, the set of failures \<open>\<F> P\<close> for any process \<open>P\<close> is
|
||||
a subset of \<open>\<F> (CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV)\<close>.% and the following lemma was proved:
|
||||
% This proof is performed by induction, based on the failure projection of \<open>STOP\<close> and that of
|
||||
% internal choice.
|
||||
|
||||
a subset of \<open>\<F> (CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV)\<close>.
|
||||
|
||||
@{cartouche [display, indent=25] \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F> P\<close>}
|
||||
|
||||
|
||||
\<^noindent> Furthermore, the following 5 relationships were demonstrated from monotonicity results and
|
||||
a denotational proof.
|
||||
%among which 1 and 2 are immediate corollaries,
|
||||
%4 and 5 are directly obtained from our monotonicity results while 3 requires a denotational proof.
|
||||
and thanks to transitivity, we can derive other relationships.
|
||||
Furthermore, the following 5 relationships were demonstrated from monotonicity results and
|
||||
a denotational proof.
|
||||
\<close>
|
||||
|
||||
|
||||
Corollary*[co1::"corollary", short_name="\<open>Corollaries on reference processes.\<close>",level="Some 2"]
|
||||
\<open> \<^hfill> \<^br> \<^vs>\<open>-0.3cm\<close>
|
||||
\<^enum> \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<sqsubseteq>\<^sub>\<F> CHAOS A\<close>
|
||||
\<^enum> \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<sqsubseteq>\<^sub>\<F> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P A\<close>
|
||||
\<^enum> \<open>CHAOS A \<sqsubseteq>\<^sub>\<F> DF A\<close>
|
||||
\<^enum> \<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P A \<sqsubseteq>\<^sub>\<F> DF A\<close>
|
||||
\<^enum> \<open>DF A \<sqsubseteq>\<^sub>\<F> RUN A\<close>
|
||||
\<^enum> \<open>DF A \<sqsubseteq>\<^sub>\<F> RUN A\<close> \<^vs>\<open>0.3cm\<close>
|
||||
|
||||
where 1 and 2 are immediate, and where 4 and 5 are directly obtained from our monotonicity
|
||||
results while 3 requires an argument over the denotational space.
|
||||
Thanks to transitivity, we can derive other relationships.\<close>
|
||||
|
||||
Last, regarding trace refinement, for any process P,
|
||||
text\<open> Lastly, regarding trace refinement, for any process P,
|
||||
its set of traces \<open>\<T> P\<close> is a subset of \<open>\<T> (CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV)\<close> and of \<open>\<T> (DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV)\<close> as well.
|
||||
%As we already proved that \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P\<close> covers all failures,
|
||||
%we can immediately infer that it also covers all traces.
|
||||
%The \<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P\<close> case requires a longer denotational proof.
|
||||
|
||||
|
||||
\<^enum> \<open>CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T> P\<close>
|
||||
\<^enum> \<open>DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T> P\<close>
|
||||
|
||||
\<close>
|
||||
|
||||
text\<open>
|
||||
|
@@ -596,39 +556,34 @@ verification. For example, if one wants to establish that a protocol implementat
a non-deterministic specification \<open>SPEC\<close> it suffices to ask if \<open>IMPL || SPEC\<close> is deadlock-free.
In this setting, \<open>SPEC\<close> becomes a kind of observer that signals non-conformance of \<open>IMPL\<close> by
deadlock.
% A livelocked system looks similar to a deadlocked one from an external point of view.
% However, livelock is sometimes considered as worse since the user may be able to observe the internal
% activities and so hope that some output will happen eventually.

In the literature, deadlock and lifelock are phenomena that are often
handled separately. One contribution of our work is establish their precise relationship inside
the Failure/Divergence Semantics of \<^csp>.\<close>

(* bizarre: Definition* does not work for this single case *)
text*[X10::"definition"]\<open> \<open>deadlock\<^sub>-free P \<equiv> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F> P\<close> \<close>
Definition*[X10::"definition", level="Some 2"]\<open> \<open>deadlock\<^sub>-free P \<equiv> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F> P\<close> \<close>

text\<open>\<^noindent> A process \<open>P\<close> is deadlock-free if and only if after any trace \<open>s\<close> without \<open>\<surd>\<close>, the union of \<open>\<surd>\<close>
text\<open>\<^noindent> A process \<open>P\<close> is deadlock-free if and only if after any trace \<open>s\<close> without \<open>\<checkmark>\<close>, the union of \<open>\<checkmark>\<close>
and all events of \<open>P\<close> can never be a refusal set associated to \<open>s\<close>, which means that \<open>P\<close> cannot
be deadlocked after any non-terminating trace.
\<close>

Theorem*[T1, short_name="\<open>DF definition captures deadlock-freeness\<close>"]
\<open> \hfill \break \<open>deadlock_free P \<longleftrightarrow> (\<forall>s\<in>\<T> P. tickFree s \<longrightarrow> (s, {\<surd>}\<union>events_of P) \<notin> \<F> P)\<close> \<close>
Definition*[X11]\<open> \<open>livelock\<^sub>-free P \<equiv> \<D> P = {} \<close> \<close>
Theorem*[T1, short_name="\<open>DF definition captures deadlock-freeness\<close>", level="Some 2"]
\<open> \<^hfill> \<^br> \<open>deadlock_free P \<longleftrightarrow> (\<forall>s\<in>\<T> P. tickFree s \<longrightarrow> (s, {\<checkmark>}\<union>events_of P) \<notin> \<F> P)\<close> \<close>
Definition*[X11, level="Some 2"]\<open> \<open>livelock\<^sub>-free P \<equiv> \<D> P = {} \<close> \<close>

text\<open> Recall that all five reference processes are livelock-free.
We also have the following lemmas about the
livelock-freeness of processes:
\<^enum> \<open>livelock\<^sub>-free P \<longleftrightarrow> \<PP> UNIV \<sqsubseteq>\<^sub>\<D> P where \<PP> \<in> \<R>\<P>\<close>
\<^enum> @{cartouche [display]\<open>livelock\<^sub>-free P \<longleftrightarrow> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T>\<^sub>\<D> P
\<longleftrightarrow> CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T>\<^sub>\<D> P\<close>}
\<^enum> \<open>livelock\<^sub>-free P \<longleftrightarrow> DF\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T>\<^sub>\<D> P \<longleftrightarrow> CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<T>\<^sub>\<D> P\<close>
\<^enum> \<open>livelock\<^sub>-free P \<longleftrightarrow> CHAOS\<^sub>S\<^sub>K\<^sub>I\<^sub>P UNIV \<sqsubseteq>\<^sub>\<F>\<^sub>\<D> P\<close>
\<close>
text\<open>
Finally, we proved the following theorem that confirms the relationship between the two vital
properties:
\<close>
Theorem*[T2, short_name="''DF implies LF''"]
Theorem*[T2, short_name="''DF implies LF''", level="Some 2"]
\<open> \<open>deadlock_free P \<longrightarrow> livelock_free P\<close> \<close>

text\<open>
@@ -642,11 +597,11 @@ then it may still be livelock-free. % This makes sense since livelocks are worse

\<close>

section*["advanced"::tc,main_author="Some(@{docitem ''safouan''}::author)",level="Some 3"]
section*["advanced"::technical,main_author="Some(@{author ''safouan''}::author)",level="Some 3"]
\<open>Advanced Verification Techniques\<close>

text\<open>
Based on the refinement framework discussed in @{docitem "newResults"}, we will now
Based on the refinement framework discussed in @{technical "newResults"}, we will now
turn to some more advanced proof principles, tactics and verification techniques.
We will demonstrate them on two paradigmatic examples well-known in the \<^csp> literature:
The CopyBuffer and Dijkstra's Dining Philosophers. In both cases, we will exploit
@@ -657,7 +612,7 @@ verification. In the latter case, we present an approach to a verification of a
architecture, in this case a ring-structure of arbitrary size.
\<close>

subsection*["illustration"::tc,main_author="Some(@{docitem ''safouan''}::author)", level="Some 3"]
subsection*["illustration"::technical,main_author="Some(@{author ''safouan''}::author)", level="Some 3"]
\<open>The General CopyBuffer Example\<close>
text\<open>
We consider the paradigmatic copy buffer example @{cite "Hoare:1985:CSP:3921" and "Roscoe:UCS:2010"}
@@ -705,7 +660,7 @@ of 2 lines proof-script involving the derived algebraic laws of \<^csp>.

After proving that \<open>SYSTEM\<close> implements \<open>COPY\<close> for arbitrary alphabets, we aim to profit from this
first established result to check which relations \<open>SYSTEM\<close> has wrt. to the reference processes of
@{docitem "processes"}. Thus, we prove that \<open>COPY\<close> is deadlock-free which implies livelock-free,
@{technical "processes"}. Thus, we prove that \<open>COPY\<close> is deadlock-free which implies livelock-free,
(proof by fixed-induction similar to \<open>lemma: COPY \<sqsubseteq> SYSTEM\<close>), from which we can immediately infer
from transitivity that \<open>SYSTEM\<close> is. Using refinement relations, we killed four birds with one stone
as we proved the deadlock-freeness and the livelock-freeness for both \<open>COPY\<close> and \<open>SYSTEM\<close> processes.
@@ -722,7 +677,7 @@ corollary deadlock_free COPY
\<close>


subsection*["inductions"::tc,main_author="Some(@{docitem ''safouan''}::author)"]
subsection*["inductions"::technical,main_author="Some(@{author ''safouan''}::author)"]
\<open>New Fixed-Point Inductions\<close>

text\<open>
@@ -739,9 +694,8 @@ For this reason, we derived a number of alternative induction schemes (which are
in the HOLCF library), which are also relevant for our final Dining Philophers example.
These are essentially adaptions of k-induction schemes applied to domain-theoretic
setting (so: requiring \<open>f\<close> continuous and \<open>P\<close> admissible; these preconditions are
skipped here):
\<^item> @{cartouche [display]\<open>... \<Longrightarrow> \<forall>i<k. P (f\<^sup>i \<bottom>) \<Longrightarrow> (\<forall>X. (\<forall>i<k. P (f\<^sup>i X)) \<longrightarrow> P (f\<^sup>k X))
\<Longrightarrow> P (\<mu>X. f X)\<close>}
skipped here):\<^vs>\<open>0.2cm\<close>
\<^item> \<open>... \<Longrightarrow> \<forall>i<k. P (f\<^sup>i \<bottom>) \<Longrightarrow> (\<forall>X. (\<forall>i<k. P (f\<^sup>i X)) \<longrightarrow> P (f\<^sup>k X)) \<Longrightarrow> P (\<mu>X. f X)\<close>
\<^item> \<open>... \<Longrightarrow> \<forall>i<k. P (f\<^sup>i \<bottom>) \<Longrightarrow> (\<forall>X. P X \<longrightarrow> P (f\<^sup>k X)) \<Longrightarrow> P (\<mu>X. f X)\<close>


@@ -749,10 +703,9 @@ skipped here):
it reduces the goal size.

Another problem occasionally occurring in refinement proofs happens when the right side term
involves more than one fixed-point process (\<^eg> \<open>P \<lbrakk>{A}\<rbrakk> Q \<sqsubseteq> S\<close>). In this situation,
involves more than one fixed-point process (\<^eg> \<open>P \<lbrakk>A\<rbrakk> Q \<sqsubseteq> S\<close>). In this situation,
we need parallel fixed-point inductions. The HOLCF library offers only a basic one:
\<^item> @{cartouche [display]\<open>... \<Longrightarrow> P \<bottom> \<bottom> \<Longrightarrow> (\<forall>X Y. P X Y \<Longrightarrow> P (f X) (g Y))
\<Longrightarrow> P (\<mu>X. f X) (\<mu>X. g X)\<close>}
\<^item> \<open>... \<Longrightarrow> P \<bottom> \<bottom> \<Longrightarrow> (\<forall>X Y. P X Y \<Longrightarrow> P (f X) (g Y)) \<Longrightarrow> P (\<mu>X. f X) (\<mu>X. g X)\<close>


\<^noindent> This form does not help in cases like in \<open>P \<lbrakk>\<emptyset>\<rbrakk> Q \<sqsubseteq> S\<close> with the interleaving operator on the
@@ -774,7 +727,7 @@ The astute reader may notice here that if the induction step is weakened (having
the base steps require enforcement.
\<close>

subsection*["norm"::tc,main_author="Some(@{docitem ''safouan''}::author)"]
subsection*["norm"::technical,main_author="Some(@{author ''safouan''}::author)"]
\<open>Normalization\<close>
text\<open>
Our framework can reason not only over infinite alphabets, but also over processes parameterized
@@ -795,7 +748,7 @@ This normal form is closed under deterministic and communication operators.
The advantage of this format is that we can mimick the well-known product automata construction
for an arbitrary number of synchronized processes under normal form.
We only show the case of the synchronous product of two processes: \<close>
text*[T3::"theorem", short_name="\<open>Product Construction\<close>"]\<open>
Theorem*[T3, short_name="\<open>Product Construction\<close>", level="Some 2"]\<open>
Parallel composition translates to normal form:
@{cartouche [display,indent=5]\<open>(P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>\<^sub>1,\<upsilon>\<^sub>1\<rbrakk> \<sigma>\<^sub>1) || (P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>\<^sub>2,\<upsilon>\<^sub>2\<rbrakk> \<sigma>\<^sub>2) =
P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<lambda>(\<sigma>\<^sub>1,\<sigma>\<^sub>2). \<tau>\<^sub>1 \<sigma>\<^sub>1 \<inter> \<tau>\<^sub>2 \<sigma>\<^sub>2 , \<lambda>(\<sigma>\<^sub>1,\<sigma>\<^sub>2).\<lambda>e.(\<upsilon>\<^sub>1 \<sigma>\<^sub>1 e, \<upsilon>\<^sub>2 \<sigma>\<^sub>2 e)\<rbrakk> (\<sigma>\<^sub>1,\<sigma>\<^sub>2)\<close>}
@@ -815,7 +768,7 @@ states via the closure \<open>\<RR>\<close>, which is defined inductively over:
Thus, normalization leads to a new characterization of deadlock-freeness inspired
from automata theory. We formally proved the following theorem:\<close>

text*[T4::"theorem", short_name="\<open>DF vs. Reacheability\<close>"]
text*[T4::"theorem", short_name="\<open>DF vs. Reacheability\<close>", level="Some 2"]
\<open> If each reachable state \<open>s \<in> (\<RR> \<tau> \<upsilon>)\<close> has outgoing transitions,
the \<^csp> process is deadlock-free:
@{cartouche [display,indent=10] \<open>\<forall>\<sigma> \<in> (\<RR> \<tau> \<upsilon> \<sigma>\<^sub>0). \<tau> \<sigma> \<noteq> {} \<Longrightarrow> deadlock_free (P\<^sub>n\<^sub>o\<^sub>r\<^sub>m\<lbrakk>\<tau>,\<upsilon>\<rbrakk> \<sigma>\<^sub>0)\<close>}
@@ -834,7 +787,7 @@ Summing up, our method consists of four stages:

\<close>

subsection*["dining_philosophers"::tc,main_author="Some(@{docitem ''safouan''}::author)",level="Some 3"]
subsection*["dining_philosophers"::technical,main_author="Some(@{author ''safouan''}::author)",level="Some 3"]
\<open>Generalized Dining Philosophers\<close>

text\<open> The dining philosophers problem is another paradigmatic example in the \<^csp> literature
@@ -926,7 +879,7 @@ for a dozen of philosophers (on a usual machine) due to the exponential combinat
Furthermore, our proof is fairly stable against modifications like adding non synchronized events like
thinking or sitting down in contrast to model-checking techniques. \<close>

section*["relatedwork"::tc,main_author="Some(@{docitem ''lina''}::author)",level="Some 3"]
section*["relatedwork"::technical,main_author="Some(@{author ''lina''}::author)",level="Some 3"]
\<open>Related work\<close>

text\<open>
@@ -993,7 +946,7 @@ restrictions on the structure of components. None of our paradigmatic examples c
be automatically proven with any of the discussed SMT techniques without restrictions.
\<close>

section*["conclusion"::conclusion,main_author="Some(@{docitem ''bu''}::author)"]\<open>Conclusion\<close>
section*["conclusion"::conclusion,main_author="Some(@{author ''bu''}::author)"]\<open>Conclusion\<close>
text\<open>We presented a formalisation of the most comprehensive semantic model for \<^csp>, a 'classical'
language for the specification and analysis of concurrent systems studied in a rich body of
literature. For this purpose, we ported @{cite "tej.ea:corrected:1997"} to a modern version
@@ -1024,10 +977,6 @@ over finite sub-systems with globally infinite systems in a logically safe way.
subsection*[bib::bibliography]\<open>References\<close>

close_monitor*[this]
(*
term\<open>\<longrightarrow>\<close>
term\<open> demon \<sigma>\<^sub>g\<^sub>l\<^sub>o\<^sub>b\<^sub>a\<^sub>l := \<Sqinter> \<Delta>t \<in> \<real>\<^sub>>\<^sub>0. ||| i\<in>A. ACTOR i \<sigma>\<^sub>g\<^sub>l\<^sub>o\<^sub>b\<^sub>a\<^sub>l
\<lbrakk>S\<rbrakk> sync!\<sigma>\<^sub>g\<^sub>l\<^sub>o\<^sub>b\<^sub>a\<^sub>l\<^sub>' \<longrightarrow> demon \<sigma>\<^sub>g\<^sub>l\<^sub>o\<^sub>b\<^sub>a\<^sub>l\<^sub>' \<close>
*)

end
(*>*)
@@ -1,7 +1,6 @@
theory PikeOS_ST (*Security Target *)

imports "../../../src/ontologies/CC_v3.1_R5/CC_v3_1_R5"
(* Isabelle_DOF.CC_v3_1_R5 in the future. *)
imports "Isabelle_DOF-Ontologies.CC_v3_1_R5"

begin

@@ -18,18 +17,20 @@ text*[pkosstref::st_ref_cls, title="''PikeOS Security Target''", st_version ="(0
It complies with the Common Criteria for Information Technology Security Evaluation
Version 3.1 Revision 4.\<close>


subsection*[pkossttoerefsubsec::st_ref_cls]\<open>TOE Reference\<close>

text*[pkostoeref::toe_ref_cls, dev_name="''''", toe_name="''PikeOS''",
toe_version= "(0,3,4)", prod_name="Some ''S3725''"]
\<open>The @{docitem toe_def} is the operating system PikeOS version 3.4
\<open>The @{docitem (unchecked) toeDef} is the operating system PikeOS version 3.4
running on the microprocessor family x86 hosting different applications.
The @{docitem toe_def} is referenced as PikeOS 3.4 base
The @{docitem (unchecked) toeDef} is referenced as PikeOS 3.4 base
product build S3725 for Linux and Windows development host with PikeOS 3.4
Certification Kit build S4250 and PikeOS 3.4 Common Criteria Kit build S4388.\<close>

subsection*[pkossttoeovrvwsubsec::st_ref_cls]\<open> TOE Overview \<close>
text*[pkosovrw1::toe_ovrw_cls]\<open>The @{definition \<open>toe\<close> } is a special kind of operating
text*[pkosovrw1::toe_ovrw_cls]\<open>The @{docitem (unchecked) \<open>toeDef\<close> } is a special kind of operating
system, that allows to effectively separate
different applications running on the same platform from each other. The TOE can host
user applications that can also be operating systems. User applications can also be
@@ -87,4 +88,4 @@ open_monitor*[PikosSR::SEC_REQ_MNT]
close_monitor*[PikosSR]

close_monitor*[stpkos]
end
end
@@ -0,0 +1,4 @@
session "PikeOS_study" = "Isabelle_DOF-Ontologies" +
options [document = false]
theories
"PikeOS_ST"
@@ -0,0 +1 @@
PikeOS_study
@@ -1,16 +1,16 @@
session "mini_odo" = "Isabelle_DOF" +
options [document = pdf, document_output = "output", document_build = dof,
dof_ontologies = "Isabelle_DOF.technical_report Isabelle_DOF.cenelec_50128",
dof_template = "Isabelle_DOF.scrreprt-modern"]
session "mini_odo" = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof]
sessions
"Physical_Quantities"
theories
"mini_odo"
document_theories
"Isabelle_DOF-Ontologies.CENELEC_50128"
document_files
"dof_session.tex"
"preamble.tex"
"root.bib"
"root.mst"
"lstisadof.sty"
"figures/df-numerics-encshaft.png"
"figures/odometer.jpeg"
"figures/three-phase-odo.pdf"
@@ -0,0 +1,3 @@
\input{mini_odo}
\input{CENELEC_50128}

(image) Before: 27 KiB, After: 27 KiB
(image) Before: 407 KiB, After: 407 KiB
(image) Before: 23 KiB, After: 23 KiB
@@ -13,8 +13,6 @@
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause

%% This is a placeholder for user-specific configuration and packages.
\usepackage{listings}
\usepackage{lstisadof}
\usepackage{wrapfig}
\usepackage{paralist}
\usepackage{numprint}
@@ -15,10 +15,12 @@
theory
mini_odo
imports
"Isabelle_DOF.CENELEC_50128"
"Isabelle_DOF-Ontologies.CENELEC_50128"
"Isabelle_DOF.technical_report"
"Physical_Quantities.SI" "Physical_Quantities.SI_Pretty"
begin
use_template "scrreprt-modern"
use_ontology technical_report and "Isabelle_DOF-Ontologies.CENELEC_50128"
declare[[strict_monitor_checking=true]]
define_shortcut* dof \<rightleftharpoons> \<open>\dof\<close>
isadof \<rightleftharpoons> \<open>\isadof{}\<close>
@@ -100,13 +102,13 @@ text\<open>
functioning of the system and for its integration into the system as a whole. In
particular, we need to make the following assumptions explicit: \<^vs>\<open>-0.3cm\<close>\<close>

text*["perfect-wheel"::assumption]
text*["perfect_wheel"::assumption]
\<open>\<^item> the wheel is perfectly circular with a given, constant radius. \<^vs>\<open>-0.3cm\<close>\<close>
text*["no-slip"::assumption]
text*["no_slip"::assumption]
\<open>\<^item> the slip between the trains wheel and the track negligible. \<^vs>\<open>-0.3cm\<close>\<close>
text*["constant-teeth-dist"::assumption]
text*["constant_teeth_dist"::assumption]
\<open>\<^item> the distance between all teeth of a wheel is the same and constant, and \<^vs>\<open>-0.3cm\<close>\<close>
text*["constant-sampling-rate"::assumption]
text*["constant_sampling_rate"::assumption]
\<open>\<^item> the sampling rate of positions is a given constant.\<close>

text\<open>
@@ -124,13 +126,13 @@ text\<open>

subsection\<open>Capturing ``System Architecture.''\<close>

figure*["three-phase"::figure,relative_width="70",src="''figures/three-phase-odo''"]
figure*["three_phase"::figure,relative_width="70",file_src="''figures/three-phase-odo.pdf''"]
\<open>An odometer with three sensors \<open>C1\<close>, \<open>C2\<close>, and \<open>C3\<close>.\<close>

text\<open>
The requirements analysis also contains a document \<^doc_class>\<open>SYSAD\<close>
(\<^typ>\<open>system_architecture_description\<close>) that contains technical drawing of the odometer,
a timing diagram (see \<^figure>\<open>three-phase\<close>), and tables describing the encoding of the position
a timing diagram (see \<^figure>\<open>three_phase\<close>), and tables describing the encoding of the position
for the possible signal transitions of the sensors \<open>C1\<close>, \<open>C2\<close>, and \<open>C3\<close>.
\<close>

@@ -144,7 +146,7 @@ text\<open>
sub-system configuration. \<close>

(*<*)
declare_reference*["df-numerics-encshaft"::figure]
declare_reference*["df_numerics_encshaft"::figure]
(*>*)
subsection\<open>Capturing ``Required Performances.''\<close>
text\<open>
@@ -158,9 +160,9 @@ text\<open>

The requirement analysis document describes the physical environment, the architecture
of the measuring device, and the required format and precision of the measurements of the odometry
function as represented (see @{figure (unchecked) "df-numerics-encshaft"}).\<close>
function as represented (see @{figure (unchecked) "df_numerics_encshaft"}).\<close>

figure*["df-numerics-encshaft"::figure,relative_width="76",src="''figures/df-numerics-encshaft''"]
figure*["df_numerics_encshaft"::figure,relative_width="76",file_src="''figures/df-numerics-encshaft.png''"]
\<open>Real distance vs. discrete distance vs. shaft-encoder sequence\<close>


@@ -213,7 +215,7 @@ text\<open>
concepts such as Cauchy Sequences, limits, differentiability, and a very substantial part of
classical Calculus. \<open>SOME\<close> is the Hilbert choice operator from HOL; the definitions of the
model parameters admit all possible positive values as uninterpreted constants. Our
\<^assumption>\<open>perfect-wheel\<close> is translated into a calculation of the circumference of the
\<^assumption>\<open>perfect_wheel\<close> is translated into a calculation of the circumference of the
wheel, while \<open>\<delta>s\<^sub>r\<^sub>e\<^sub>s\<close>, the resolution of the odometer, can be calculated
from the these parameters. HOL-Analysis permits to formalize the fundamental physical observables:
\<close>
@@ -294,7 +296,7 @@ and the global model parameters such as wheel diameter, the number of teeth per
sampling frequency etc., we can infer the maximal time of service as well the maximum distance
the device can measure. As an example configuration, choosing:

\<^item> \<^term>\<open>(1 *\<^sub>Q metre)::real[m]\<close> for \<^term>\<open>w\<^sub>d\<close> (wheel-diameter),
\<^item> \<^term>\<open>(1 *\<^sub>Q metre):: real[m]\<close> for \<^term>\<open>w\<^sub>d\<close> (wheel-diameter),
\<^item> \<^term>\<open>100 :: real\<close> for \<^term>\<open>tpw\<close> (teeth per wheel),
\<^item> \<^term>\<open>80 *\<^sub>Q kmh :: real[m\<cdot>s\<^sup>-\<^sup>1]\<close> for \<^term>\<open>Speed\<^sub>M\<^sub>a\<^sub>x\<close>,
\<^item> \<^term>\<open>14.4 *\<^sub>Q kHz :: real[s\<^sup>-\<^sup>1]\<close> for the sampling frequency,
@@ -626,14 +628,14 @@ text\<open>
\<close>

text\<open>Examples for declaration of typed doc-classes "assumption" (sic!) and "hypothesis" (sic!!),
concepts defined in the underlying ontology @{theory "Isabelle_DOF.CENELEC_50128"}. \<close>
concepts defined in the underlying ontology @{theory "Isabelle_DOF-Ontologies.CENELEC_50128"}. \<close>
text*[ass2::assumption, long_name="Some ''assumption one''"] \<open> The subsystem Y is safe. \<close>
text*[hyp1::hypothesis] \<open> \<open>P \<noteq> NP\<close> \<close>

text\<open>
A real example fragment fsrom a larger project, declaring a text-element as a
A real example fragment from a larger project, declaring a text-element as a
"safety-related application condition", a concept defined in the
@{theory "Isabelle_DOF.CENELEC_50128"} ontology:\<close>
@{theory "Isabelle_DOF-Ontologies.CENELEC_50128"} ontology:\<close>

text*[hyp2::hypothesis]\<open>Under the assumption @{assumption \<open>ass2\<close>} we establish the following: ... \<close>

@@ -654,11 +656,10 @@ text*[t10::test_result]
test-execution via, \<^eg>, a makefile or specific calls to a test-environment or test-engine. \<close>


text
\<open> Finally some examples of references to doc-items, i.e. text-elements
with declared meta-information and status. \<close>
text \<open> As established by @{test_result (unchecked) \<open>t10\<close>},
@{test_result (define) \<open>t10\<close>} \<close>
text \<open> Finally some examples of references to doc-items, i.e. text-elements
with declared meta-information and status. \<close>

text \<open> As established by @{test_result \<open>t10\<close>}\<close>
text \<open> the @{test_result \<open>t10\<close>}
as well as the @{SRAC \<open>ass122\<close>}\<close>
text \<open> represent a justification of the safety related applicability
@@ -669,7 +670,6 @@ text \<open> due to notational conventions for antiquotations, one may even writ
"represent a justification of the safety related applicability
condition \<^SRAC>\<open>ass122\<close> aka exported constraint \<^EC>\<open>ass122\<close>."\<close>


(*<*)
end
(*>*)
@@ -1,3 +1,5 @@
scholarly_paper
technical_report
CENELEC_50128
cytology
CC_ISO15408
beamerx
@@ -0,0 +1,2 @@
poster
presentation
@@ -0,0 +1,8 @@
chapter AFP

session "poster-example" (AFP) = "Isabelle_DOF-Ontologies" +
options [document = pdf, document_output = "output", document_build = dof, timeout = 300]
theories
"poster"
document_files
"preamble.tex"
@@ -0,0 +1,2 @@
%% This is a placeholder for user-specific configuration and packages.

@@ -0,0 +1,39 @@
(*<*)
theory "poster"
imports "Isabelle_DOF.scholarly_paper"
"Isabelle_DOF-Ontologies.document_templates"
begin

use_template "beamerposter-UNSUPPORTED"
use_ontology "scholarly_paper"
(*>*)

title*[tit::title]\<open>Example Presentation\<close>

author*[safouan,email="\<open>example@example.org\<close>",affiliation="\<open>Example Org\<close>"]\<open>Eliza Example\<close>

text\<open>
\vfill
\begin{block}{\large Fontsizes}
\centering
{\tiny tiny}\par
{\scriptsize scriptsize}\par
{\footnotesize footnotesize}\par
{\normalsize normalsize}\par
{\large large}\par
{\Large Large}\par
{\LARGE LARGE}\par
{\veryHuge veryHuge}\par
{\VeryHuge VeryHuge}\par
{\VERYHuge VERYHuge}\par
\end{block}
\vfill
\<close>

text\<open>
@{block (title = "\<open>Title\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close>") "\<open>Block content\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close>"}
\<close>

(*<*)
end
(*>*)
@@ -0,0 +1,9 @@
chapter AFP

session "presentation-example" (AFP) = "Isabelle_DOF-Ontologies" +
  options [document = pdf, document_output = "output", document_build = dof, timeout = 300]
  theories
    "presentation"
  document_files
    "preamble.tex"
    "figures/A.png"
@@ -0,0 +1,2 @@
%% This is a placeholder for user-specific configuration and packages.
@@ -0,0 +1,69 @@
(*<*)
theory "presentation"
  imports "Isabelle_DOF.scholarly_paper"
          "Isabelle_DOF-Ontologies.document_templates"
begin

use_template "beamer-UNSUPPORTED"
use_ontology "scholarly_paper"
(*>*)

title*[tit::title]\<open>Example Presentation\<close>

author*[safouan,email="\<open>example@example.org\<close>",affiliation="\<open>Example Org\<close>"]\<open>Eliza Example\<close>

text\<open>
\begin{frame}
\frametitle{Example Slide}
\centering\huge This is an example!
\end{frame}
\<close>

frame*[test_frame
      , frametitle = \<open>\<open>\<open>Example Slide\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close> with items @{thm "HOL.refl"}\<close>\<close>
      , framesubtitle = "''Subtitle''"]
\<open>This is an example!
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> and the term encoding the title of this frame is \<^term_>\<open>frametitle @{frame \<open>test_frame\<close>}\<close>\<close>

frame*[test_frame2
      , frametitle = "''Example Slide''"
      , framesubtitle = \<open>\<open>\<open>Subtitle\<^sub>t\<^sub>e\<^sub>s\<^sub>t:\<close> the value of \<^term>\<open>(3::int) + 3\<close> is @{value "(3::int) + 3"}\<close>\<close>]
\<open>Test frame env \<^term>\<open>refl\<close>\<close>

frame*[test_frame3, frametitle = "''A slide with a Figure''"]
\<open>A figure
@{figure_content (width=45, caption=\<open>\<open>Figure\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close> is not the \<^term>\<open>refl\<close> theorem (@{thm "refl"}).\<close>)
  "figures/A.png"}\<close>

frame*[test_frame4
      , options = "''allowframebreaks''"
      , frametitle = "''Example Slide with frame break''"
      , framesubtitle = \<open>\<open>\<open>Subtitle\<^sub>t\<^sub>e\<^sub>s\<^sub>t:\<close> the value of \<^term>\<open>(3::int) + 3\<close> is @{value "(3::int) + 3"}\<close>\<close>]
\<open>
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> and the term encoding the title of this frame is \<^term_>\<open>frametitle @{frame \<open>test_frame4\<close>}\<close>
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<^item> The term \<^term>\<open>refl\<close> is...
\<close>

(*<*)
end
(*>*)
@@ -68,7 +68,7 @@ onto_class procaryotic_cells = cell +

onto_class eucaryotic_cells = cell +
  organelles :: "organelles' list"
  invariant has_nucleus :: "\<lambda>\<sigma>::eucaryotic_cells. \<exists> org \<in> set (organelles \<sigma>). is\<^sub>n\<^sub>u\<^sub>c\<^sub>l\<^sub>e\<^sub>u\<^sub>s org"
  invariant has_nucleus :: "\<exists> org \<in> set (organelles \<sigma>). is\<^sub>n\<^sub>u\<^sub>c\<^sub>l\<^sub>e\<^sub>u\<^sub>s org"
  \<comment> \<open>Cells must have at least one nucleus. However, this should be executable.\<close>

find_theorems (70)name:"eucaryotic_cells"

@@ -78,13 +78,10 @@ value "is\<^sub>n\<^sub>u\<^sub>c\<^sub>l\<^sub>e\<^sub>u\<^sub>s (mk\<^sub>n\<^

term \<open>eucaryotic_cells.organelles\<close>

value \<open>(eucaryotic_cells.organelles(eucaryotic_cells.make X Y Z Z Z [] 3 []))\<close>

value \<open>has_nucleus_inv(eucaryotic_cells.make X Y Z Z Z [] 3 [])\<close>

value \<open>has_nucleus_inv(eucaryotic_cells.make X Y Z Z Z [] 3
                       [upcast\<^sub>n\<^sub>u\<^sub>c\<^sub>l\<^sub>e\<^sub>u\<^sub>s (nucleus.make a b c d [])])\<close>
value \<open>(eucaryotic_cells.organelles(eucaryotic_cells.make X Y Z Z Z [] []))\<close>

value \<open>has_nucleus_inv(eucaryotic_cells.make X Y Z Z Z [] [])\<close>

value \<open>has_nucleus_inv(eucaryotic_cells.make X Y Z Z Z [] [upcast\<^sub>n\<^sub>u\<^sub>c\<^sub>l\<^sub>e\<^sub>u\<^sub>s (nucleus.make a b c )])\<close>

end
@@ -0,0 +1,4 @@
session "Cytology" = "Isabelle_DOF" +
  options [document = false]
  theories
    "Cytology"

@@ -1,2 +1 @@
Isabelle_DOF-Manual
TR_my_commented_isabelle

@@ -1,7 +1,5 @@
session "TR_MyCommentedIsabelle" = "Isabelle_DOF" +
  options [document = pdf, document_output = "output", document_build = dof,
           dof_ontologies = "Isabelle_DOF.technical_report", dof_template = Isabelle_DOF.scrreprt,
           quick_and_dirty = true]
  options [document = pdf, document_output = "output", document_build = dof]
  theories
    "TR_MyCommentedIsabelle"
  document_files
@@ -14,9 +14,11 @@
(*<*)
theory TR_MyCommentedIsabelle
  imports "Isabelle_DOF.technical_report"
begin

use_template "scrreprt"
use_ontology "technical_report"

define_shortcut* isabelle \<rightleftharpoons> \<open>Isabelle/HOL\<close>

open_monitor*[this::report]

@@ -62,7 +64,7 @@ text\<open> \<^vs>\<open>-0.5cm\<close>
Isabelle and Isabelle/HOL, a complementary text to the unfortunately somewhat outdated
"The Isabelle Cookbook" in \<^url>\<open>https://nms.kcl.ac.uk/christian.urban/Cookbook/\<close>.
The present text is also complementary to the current version of
\<^url>\<open>https://isabelle.in.tum.de/dist/Isabelle2021/doc/isar-ref.pdf\<close>
\<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>
"The Isabelle/Isar Implementation" by Makarius Wenzel in that it focusses on subjects
not covered there, or presents alternative explanations for which I believe, based on my
experiences with students and PhDs, that they are helpful.

@@ -77,7 +79,7 @@ text\<open> \<^vs>\<open>-0.5cm\<close>
maximum of formal content which makes this text re-checkable at each load and easier
maintainable. \<close>

figure*[architecture::figure,relative_width="70",src="''figures/isabelle-architecture''"]\<open>
figure*[architecture::figure,relative_width="70",file_src="''figures/isabelle-architecture.pdf''"]\<open>
The system architecture of Isabelle (left-hand side) and the asynchronous communication
between the Isabelle system and the IDE (right-hand side). \<close>

@@ -146,7 +148,7 @@ text\<open> \<open>*\<open>This is a text.\<close>\<close>\<close>

text\<open>and displayed in the Isabelle/jEdit front-end at the screen by:\<close>

figure*[fig2::figure, relative_width="60", placement="pl_h", src="''figures/text-element''"]
figure*[fig2::figure, relative_width="60", file_src="''figures/text-element.pdf''"]
\<open>A text-element as presented in Isabelle/jEdit.\<close>

text\<open>The text-commands, ML-commands (and in principle any other command) can be seen as
@@ -345,7 +347,7 @@ text\<open>
\<^item> \<^ML>\<open>Context.proper_subthy : theory * theory -> bool\<close> subcontext test
\<^item> \<^ML>\<open>Context.Proof: Proof.context -> Context.generic \<close> A constructor embedding local contexts
\<^item> \<^ML>\<open>Context.proof_of : Context.generic -> Proof.context\<close> the inverse
\<^item> \<^ML>\<open>Context.theory_name : theory -> string\<close>
\<^item> \<^ML>\<open>Context.theory_name : {long:bool} -> theory -> string\<close>
\<^item> \<^ML>\<open>Context.map_theory: (theory -> theory) -> Context.generic -> Context.generic\<close>
\<close>

@@ -356,7 +358,7 @@ text\<open>The structure \<^ML_structure>\<open>Proof_Context\<close> provides a
\<^item> \<^ML>\<open> Context.Proof: Proof.context -> Context.generic \<close>
  the path to a generic Context, i.e. a sum-type of global and local contexts
  in order to simplify system interfaces
\<^item> \<^ML>\<open> Proof_Context.get_global: theory -> string -> Proof.context\<close>
\<^item> \<^ML>\<open> Proof_Context.get_global: {long:bool} -> theory -> string -> Proof.context\<close>
\<close>

@@ -364,7 +366,7 @@ subsection*[t213::example]\<open>Mechanism 2 : Extending the Global Context \<op

text\<open>A central mechanism for constructing user-defined data is by the \<^ML_functor>\<open>Generic_Data\<close>-functor.
A plugin needing some data \<^verbatim>\<open>T\<close> and providing it with implementations for an
\<^verbatim>\<open>empty\<close>, and operations \<^verbatim>\<open>merge\<close> and \<^verbatim>\<open>extend\<close>, can construct a lense with operations
\<^verbatim>\<open>empty\<close>, and operation \<^verbatim>\<open>merge\<close>, can construct a lense with operations
\<^verbatim>\<open>get\<close> and \<^verbatim>\<open>put\<close> that attach this data into the generic system context. Rather than using
unsynchronized SML mutable variables, this is the mechanism to introduce component local
data in Isabelle, which allows to manage this data for the necessary backtrack and synchronization

@@ -373,14 +375,12 @@ text\<open>A central mechanism for constructing user-defined data is by the \<^M
ML \<open>
datatype X = mt
val init = mt;
val ext = I
fun merge (X,Y) = mt

structure Data = Generic_Data
(
  type T = X
  val empty = init
  val extend = ext
  val merge = merge
);
\<close>
@@ -807,18 +807,13 @@ text\<open> They reflect the Pure logic depicted in a number of presentations s
Notated as logical inference rules, these operations were presented as follows:
\<close>

side_by_side_figure*["text-elements"::side_by_side_figure,anchor="''fig-kernel1''",
                     caption="''Pure Kernel Inference Rules I ''",relative_width="48",
                     src="''figures/pure-inferences-I''",anchor2="''fig-kernel2''",
                     caption2="''Pure Kernel Inference Rules II''",relative_width2="47",
                     src2="''figures/pure-inferences-II''"]\<open> \<close>
text*["text_elements"::float,
      main_caption="\<open>Kernel Inference Rules.\<close>"]
\<open>
@{fig_content (width=48, caption="Pure Kernel Inference Rules I.") "figures/pure-inferences-I.pdf"
}\<^hfill>@{fig_content (width=47, caption="Pure Kernel Inference Rules II.") "figures/pure-inferences-II.pdf"}
\<close>

(*
figure*[kir1::figure,relative_width="100",src="''figures/pure-inferences-I''"]
\<open> Pure Kernel Inference Rules I.\<close>
figure*[kir2::figure,relative_width="100",src="''figures/pure-inferences-II''"]
\<open> Pure Kernel Inference Rules II. \<close>
*)

text\<open>Note that the transfer rule:
\[
@@ -891,7 +886,6 @@ datatype thy = Thy of
\<^item> \<^ML>\<open>Theory.axiom_space: theory -> Name_Space.T\<close>
\<^item> \<^ML>\<open>Theory.all_axioms_of: theory -> (string * term) list\<close>
\<^item> \<^ML>\<open>Theory.defs_of: theory -> Defs.T\<close>
\<^item> \<^ML>\<open>Theory.join_theory: theory list -> theory\<close>
\<^item> \<^ML>\<open>Theory.at_begin: (theory -> theory option) -> theory -> theory\<close>
\<^item> \<^ML>\<open>Theory.at_end: (theory -> theory option) -> theory -> theory\<close>
\<^item> \<^ML>\<open>Theory.begin_theory: string * Position.T -> theory list -> theory\<close>

@@ -1144,8 +1138,7 @@ text\<open>
necessary infrastructure --- i.e. the operations to pack and unpack theories and
queries on it:

\<^item> \<^ML>\<open> Toplevel.theory_toplevel: theory -> Toplevel.state\<close>
\<^item> \<^ML>\<open> Toplevel.init_toplevel: unit -> Toplevel.state\<close>
\<^item> \<^ML>\<open> Toplevel.make_state: theory option -> Toplevel.state\<close>
\<^item> \<^ML>\<open> Toplevel.is_toplevel: Toplevel.state -> bool\<close>
\<^item> \<^ML>\<open> Toplevel.is_theory: Toplevel.state -> bool\<close>
\<^item> \<^ML>\<open> Toplevel.is_proof: Toplevel.state -> bool\<close>

@@ -1183,7 +1176,7 @@ text\<open> The extensibility of Isabelle as a system framework depends on a num
\<^item> \<^ML>\<open>Toplevel.theory: (theory -> theory) -> Toplevel.transition -> Toplevel.transition\<close>
  adjoins a theory transformer.
\<^item> \<^ML>\<open>Toplevel.generic_theory: (generic_theory -> generic_theory) -> Toplevel.transition -> Toplevel.transition\<close>
\<^item> \<^ML>\<open>Toplevel.theory': (bool -> theory -> theory) -> Toplevel.presentation -> Toplevel.transition -> Toplevel.transition\<close>
\<^item> \<^ML>\<open>Toplevel.theory': (bool -> theory -> theory) -> Toplevel.presentation option -> Toplevel.transition -> Toplevel.transition\<close>
\<^item> \<^ML>\<open>Toplevel.exit: Toplevel.transition -> Toplevel.transition\<close>
\<^item> \<^ML>\<open>Toplevel.ignored: Position.T -> Toplevel.transition\<close>
\<^item> \<^ML>\<open>Toplevel.present_local_theory: (xstring * Position.T) option ->

@@ -1201,7 +1194,6 @@ text\<open>
\<^item> \<^ML>\<open>Document.state : unit -> Document.state\<close>, giving the state as a "collection" of named
  nodes, each consisting of an editable list of commands, associated with asynchronous
  execution process,
\<^item> \<^ML>\<open>Session.get_keywords : unit -> Keyword.keywords\<close>, this looks to be session global,
\<^item> \<^ML>\<open>Thy_Header.get_keywords : theory -> Keyword.keywords\<close> this looks to be just theory global.

@@ -1275,7 +1267,6 @@ subsection\<open>Miscellaneous\<close>

text\<open>Here are a few queries relevant for the global config of the isar engine:\<close>
ML\<open> Document.state();\<close>
ML\<open> Session.get_keywords(); (* this looks to be session global. *) \<close>
ML\<open> Thy_Header.get_keywords @{theory};(* this looks to be really theory global. *) \<close>
@@ -1440,7 +1431,7 @@ text\<open>The document model foresees a number of text files, which are organize
secondary formats can be \<^verbatim>\<open>.sty\<close>,\<^verbatim>\<open>.tex\<close>, \<^verbatim>\<open>.png\<close>, \<^verbatim>\<open>.pdf\<close>, or other files processed
by Isabelle and listed in a configuration processed by the build system.\<close>

figure*[fig3::figure, relative_width="100",src="''figures/document-model''"]
figure*[fig3::figure, relative_width="100",file_src="''figures/document-model.pdf''"]
\<open>A Theory-Graph in the Document Model\<close>

text\<open>A \<^verbatim>\<open>.thy\<close> file consists of a \<^emph>\<open>header\<close>, a \<^emph>\<open>context-definition\<close> and

@@ -1535,7 +1526,7 @@ text\<open> ... uses the antiquotation @{ML "@{here}"} to infer from the system
of itself in the global document, converts it to markup (a string-representation of it) and sends
it via the usual @{ML "writeln"} to the interface. \<close>

figure*[hyplinkout::figure,relative_width="40",src="''figures/markup-demo''"]
figure*[hyplinkout::figure,relative_width="40",file_src="''figures/markup-demo.png''"]
\<open>Output with hyperlinked position.\<close>

text\<open>@{figure \<open>hyplinkout\<close>} shows the produced output where the little house-like symbol in the

@@ -1637,7 +1628,7 @@ val data = \<comment> \<open>Derived from Yakoub's example ;-)\<close>
, (\<open>Frédéric II\<close>, \<open>King of Sicily\<close>)
, (\<open>Frédéric III\<close>, \<open>the Handsome\<close>)
, (\<open>Frédéric IV\<close>, \<open>of the Empty Pockets\<close>)
, (\<open>Frédéric V\<close>, \<open>King of Denmark–Norway\<close>)
, (\<open>Frédéric V\<close>, \<open>King of Denmark-Norway\<close>)
, (\<open>Frédéric VI\<close>, \<open>the Knight\<close>)
, (\<open>Frédéric VII\<close>, \<open>Count of Toggenburg\<close>)
, (\<open>Frédéric VIII\<close>, \<open>Count of Zollern\<close>)
@@ -1882,18 +1873,17 @@ Common Stuff related to Inner Syntax Parsing
\<^item>\<^ML>\<open>Args.internal_typ : typ parser\<close>
\<^item>\<^ML>\<open>Args.internal_term: term parser\<close>
\<^item>\<^ML>\<open>Args.internal_fact: thm list parser\<close>
\<^item>\<^ML>\<open>Args.internal_attribute: (morphism -> attribute) parser\<close>
\<^item>\<^ML>\<open>Args.internal_declaration: declaration parser\<close>
\<^item>\<^ML>\<open>Args.internal_attribute: attribute Morphism.entity parser\<close>
\<^item>\<^ML>\<open>Args.alt_name : string parser\<close>
\<^item>\<^ML>\<open>Args.liberal_name: string parser\<close>

Common Isar Syntax
\<^item>\<^ML>\<open>Args.named_source: (Token.T -> Token.src) -> Token.src parser\<close>
\<^item>\<^ML>\<open>Args.named_typ : (string -> typ) -> typ parser\<close>
\<^item>\<^ML>\<open>Args.named_term : (string -> term) -> term parser\<close>
\<^item>\<^ML>\<open>Args.embedded_declaration: (Input.source -> declaration) -> declaration parser\<close>
\<^item>\<^ML>\<open>Args.embedded_declaration: (Input.source -> Morphism.declaration_entity) ->
        Morphism.declaration_entity parser\<close>
\<^item>\<^ML>\<open>Args.typ_abbrev : typ context_parser\<close>
\<^item>\<^ML>\<open>Args.typ: typ context_parser\<close>
\<^item>\<^ML>\<open>Args.term: term context_parser\<close>

@@ -2153,7 +2143,7 @@ text\<open>
\<^item>\<^ML>\<open>Document_Output.output_document: Proof.context -> {markdown: bool} -> Input.source -> Latex.text \<close>
\<^item>\<^ML>\<open>Document_Output.output_token: Proof.context -> Token.T -> Latex.text \<close>
\<^item>\<^ML>\<open>Document_Output.output_source: Proof.context -> string -> Latex.text \<close>
\<^item>\<^ML>\<open>Document_Output.present_thy: Options.T -> theory -> Document_Output.segment list -> Latex.text \<close>
\<^item>\<^ML>\<open>Document_Output.present_thy: Options.T -> Keyword.keywords -> string -> Document_Output.segment list -> Latex.text \<close>

\<^item>\<^ML>\<open>Document_Output.isabelle: Proof.context -> Latex.text -> Latex.text\<close>
\<^item>\<^ML>\<open>Document_Output.isabelle_typewriter: Proof.context -> Latex.text -> Latex.text\<close>
@@ -10,15 +10,20 @@
 * SPDX-License-Identifier: BSD-2-Clause
 *************************************************************************)

chapter\<open>Common Criteria Definitions\<close>
chapter\<open>Common Criteria\<close>
section\<open>Terminology\<close>

(*<<*)
theory CC_terminology
  imports "Isabelle_DOF.technical_report"
  imports
    "Isabelle_DOF.technical_report"
begin

define_ontology "DOF-CC_terminology.sty" "CC"

(*>>*)
text\<open>We re-use the class @\<open>typ math_content\<close>, which provides also a framework for
semi-formal terminology, which we re-use by this definition.\<close>
@@ -35,20 +40,19 @@ type_synonym concept = concept_definition
declare[[ Definition_default_class="concept_definition"]]

(*>>*)

section \<open>Terminology\<close>
subsection \<open>Terminology\<close>

subsection \<open>Terms and definitions common in the CC\<close>
subsubsection \<open>Terms and definitions common in the CC\<close>

Definition* [aas_def, tag= "''adverse actions''"]
\<open>actions performed by a threat agent on an asset\<close>

declare_reference*[toe_def]
declare_reference*[toeDef]

Definition* [assts_def, tag="''assets''"]
\<open>entities that the owner of the @{docitem toe_def} presumably places value upon \<close>
\<open>entities that the owner of the @{docitem (unchecked) toeDef} presumably places value upon \<close>

Definition* [asgn_def, tag="''assignment''"]
\<open>the specification of an identified parameter in a component (of the CC) or requirement.\<close>

@@ -56,7 +60,8 @@ Definition* [asgn_def, tag="''assignment''"]
declare_reference*[sfrs_def]

Definition* [assrc_def, tag="''assurance''"]
\<open>grounds for confidence that a @{docitem toe_def} meets the @{docitem sfrs_def}\<close>
\<open>grounds for confidence that a @{docitem (unchecked) toeDef}
meets the @{docitem (unchecked) sfrs_def}\<close>

Definition* [attptl_def, tag="''attack potential''"]
\<open>measure of the effort to be expended in attacking a TOE, expressed in terms of
@@ -69,9 +74,10 @@ Definition* [authdata_def, tag="''authentication data''"]
\<open>information used to verify the claimed identity of a user\<close>

Definition* [authusr_def, tag = "''authorised user''"]
\<open>@{docitem toe_def} user who may, in accordance with the @{docitem sfrs_def}, perform an operation\<close>
\<open>@{docitem (unchecked) toeDef} user who may,
in accordance with the @{docitem (unchecked) sfrs_def}, perform an operation\<close>

Definition* [bpp_def, tag="''Base Protection Profile''"]
Definition* [bppDef, tag="''Base Protection Profile''"]
\<open>Protection Profile used as a basis to build a Protection Profile Configuration\<close>

Definition* [cls_def,tag="''class''"]

@@ -104,8 +110,8 @@ Definition* [cfrm_def,tag="''confirm''"]
term is only applied to evaluator actions.\<close>

Definition* [cnnctvty_def, tag="''connectivity''"]
\<open>property of the @{docitem toe_def} allowing interaction with IT entities external to the
@{docitem toe_def}
\<open>property of the @{docitem (unchecked) toeDef} allowing interaction with IT entities external to the
@{docitem (unchecked) toeDef}

This includes exchange of data by wire or by wireless means, over any
distance in any environment or configuration.\<close>
@@ -118,17 +124,20 @@ Definition* [cnt_vrb_def, tag="''counter, verb''"]
\<open>meet an attack where the impact of a particular threat is mitigated
but not necessarily eradicated\<close>

declare_reference*[st_def]
declare_reference*[pp_def]
declare_reference*[stDef]
declare_reference*[ppDef]

Definition* [dmnst_conf_def, tag="''demonstrable conformance''"]
\<open>relation between an @{docitem st_def} and a @{docitem pp_def}, where the @{docitem st_def}
\<open>relation between an @{docitem (unchecked) stDef} and a @{docitem (unchecked) ppDef},
where the @{docitem (unchecked) stDef}
provides a solution which solves the generic security problem in the PP

The @{docitem pp_def} and the @{docitem st_def} may contain entirely different statements that discuss
The @{docitem (unchecked) ppDef} and the @{docitem (unchecked) stDef} may contain
entirely different statements that discuss
different entities, use different concepts etc. Demonstrable conformance is
also suitable for a @{docitem toe_def} type where several similar @{docitem pp_def}s already exist, thus
allowing the ST author to claim conformance to these @{docitem pp_def}s simultaneously,
also suitable for a @{docitem (unchecked) toeDef} type
where several similar @{docitem (unchecked) ppDef}s already exist, thus
allowing the ST author to claim conformance to these @{docitem (unchecked) ppDef}s simultaneously,
thereby saving work.\<close>

Definition* [dmstrt_def, tag="''demonstrate''"]
@@ -136,9 +145,10 @@ Definition* [dmstrt_def, tag="''demonstrate''"]

Definition* [dpndcy, tag="''dependency''"]
\<open>relationship between components such that if a requirement based on the depending
component is included in a @{docitem pp_def}, ST or package, a requirement based on
the component that is depended upon must normally also be included in the @{docitem pp_def},
@{docitem st_def} or package\<close>
component is included in a @{docitem (unchecked) ppDef}, ST or package, a requirement based on
the component that is depended upon must normally also be included
in the @{docitem (unchecked) ppDef},
@{docitem (unchecked) stDef} or package\<close>

Definition* [dscrb_def, tag="''describe''"]
\<open>provide specific details of an entity\<close>

@@ -153,7 +163,7 @@ Definition* [dtrmn_def, tag="''determine''"]
performed which needs to be reviewed\<close>

Definition* [devenv_def, tag="''development environment''"]
\<open>environment in which the @{docitem toe_def} is developed\<close>
\<open>environment in which the @{docitem (unchecked) toeDef} is developed\<close>

Definition* [elmnt_def, tag="''element''"]
\<open>indivisible statement of a security need\<close>
@@ -165,26 +175,27 @@ Definition* [ensr_def, tag="''ensure''"]
consequence is not fully certain, on the basis of that action alone.\<close>

Definition* [eval_def, tag="''evaluation''"]
\<open>assessment of a @{docitem pp_def}, an @{docitem st_def} or a @{docitem toe_def},
against defined criteria.\<close>
\<open>assessment of a @{docitem (unchecked) ppDef}, an @{docitem (unchecked) stDef}
or a @{docitem (unchecked) toeDef}, against defined criteria.\<close>

Definition* [eal_def, tag= "''evaluation assurance level''"]
\<open>set of assurance requirements drawn from CC Part 3, representing a point on the
CC predefined assurance scale, that form an assurance package\<close>

Definition* [eval_auth_def, tag="''evaluation authority''"]
\<open>body that sets the standards and monitors the quality of evaluations conducted by bodies within a specific community and
implements the CC for that community by means of an evaluation scheme\<close>
\<open>body that sets the standards and monitors the quality of evaluations conducted
by bodies within a specific community and implements the CC for that community
by means of an evaluation scheme\<close>

Definition* [eval_schm_def, tag="''evaluation scheme''"]
\<open>administrative and regulatory framework under which the CC is applied by an
evaluation authority within a specific community\<close>

Definition* [exst_def, tag="''exhaustive''"]
Definition* [exstDef, tag="''exhaustive''"]
\<open>characteristic of a methodical approach taken to perform an
analysis or activity according to an unambiguous plan
This term is used in the CC with respect to conducting an analysis or other
activity. It is related to “systematic” but is considerably stronger, in that it
activity. It is related to ``systematic'' but is considerably stronger, in that it
indicates not only that a methodical approach has been taken to perform the
analysis or activity according to an unambiguous plan, but that the plan that
was followed is sufficient to ensure that all possible avenues have been
@@ -235,7 +246,7 @@ Definition* [intl_com_chan_def, tag ="''internal communication channel''"]
Definition* [int_toe_trans, tag="''internal TOE transfer''"]
\<open>communicating data between separated parts of the TOE\<close>

Definition* [inter_consist_def, tag="''internally consistent''"]
Definition* [inter_consistDef, tag="''internally consistent''"]
\<open>no apparent contradictions exist between any aspects of an entity

In terms of documentation, this means that there can be no statements within

@@ -270,7 +281,7 @@ Definition* [org_sec_po_def, tag="''organisational security policy''"]

Definition* [pckg_def, tag="''package''"]
\<open>named set of either security functional or security assurance requirements
An example of a package is “EAL 3”.\<close>
An example of a package is ``EAL 3''.\<close>

Definition* [pp_config_def, tag="''Protection Profile Configuration''"]
\<open>Protection Profile composed of Base Protection Profiles and Protection Profile Module\<close>

@@ -278,7 +289,7 @@ Definition* [pp_config_def, tag="''Protection Profile Configuration''"]
Definition* [pp_eval_def, tag="''Protection Profile evaluation''"]
\<open> assessment of a PP against defined criteria \<close>

Definition* [pp_def, tag="''Protection Profile''"]
Definition* [ppDef, tag="''Protection Profile''"]
\<open>implementation-independent statement of security needs for a TOE type\<close>

Definition* [ppm_def, tag="''Protection Profile Module''"]
@@ -290,36 +301,37 @@ declare_reference*[tsf_def]
Definition* [prv_def, tag="''prove''"]
\<open>show correspondence by formal analysis in its mathematical sense
It is completely rigorous in all ways. Typically, “prove” is used when there is
a desire to show correspondence between two @{docitem tsf_def} representations at a high
level of rigour.\<close>
a desire to show correspondence between two @{docitem (unchecked) tsf_def}
representations at a high level of rigour.\<close>

Definition* [ref_def, tag="''refinement''"]
\<open>addition of details to a component\<close>

Definition* [role_def, tag="''role''"]
\<open>predefined set of rules establishing the allowed interactions between
a user and the @{docitem toe_def}\<close>
a user and the @{docitem (unchecked) toeDef}\<close>

declare_reference*[sfp_def]

Definition* [scrt_def, tag="''secret''"]
\<open>information that must be known only to authorised users and/or the
@{docitem tsf_def} in order to enforce a specific @{docitem sfp_def}\<close>
@{docitem (unchecked) tsf_def} in order to enforce a specific @{docitem (unchecked) sfp_def}\<close>

declare_reference*[sfr_def]

Definition* [sec_st_def, tag="''secure state''"]
\<open>state in which the @{docitem tsf_def} data are consistent and the @{docitem tsf_def}
continues correct enforcement of the @{docitem sfr_def}s\<close>
Definition* [sec_stDef, tag="''secure state''"]
\<open>state in which the @{docitem (unchecked) tsf_def} data are consistent
and the @{docitem (unchecked) tsf_def}
continues correct enforcement of the @{docitem (unchecked) sfr_def}s\<close>

Definition* [sec_att_def, tag="''security attribute''"]
\<open>property of subjects, users (including external IT products), objects,
information, sessions and/or resources that is used in defining the @{docitem sfr_def}s
and whose values are used in enforcing the @{docitem sfr_def}s\<close>
information, sessions and/or resources that is used in defining the @{docitem (unchecked) sfr_def}s
and whose values are used in enforcing the @{docitem (unchecked) sfr_def}s\<close>

Definition* [sec_def, tag="''security''"]
\<open>function policy set of rules describing specific security behaviour enforced
by the @{docitem tsf_def} and expressible as a set of @{docitem sfr_def}s\<close>
by the @{docitem (unchecked) tsf_def} and expressible as a set of @{docitem (unchecked) sfr_def}s\<close>

Definition* [sec_obj_def, tag="''security objective''"]
\<open>statement of an intent to counter identified threats and/or satisfy identified
@@ -328,18 +340,21 @@ Definition* [sec_obj_def, tag="''security objective''"]
Definition* [sec_prob_def, tag ="''security problem''"]
\<open>statement which in a formal manner defines the nature and scope of the security that
the TOE is intended to address This statement consists of a combination of:
\begin{itemize}
\item threats to be countered by the TOE and its operational environment,
\item the OSPs enforced by the TOE and its operational environment, and
\item the assumptions that are upheld for the operational environment of the TOE.
\end{itemize}\<close>
\begin{itemize}
\item threats to be countered by the TOE and its operational environment,
\item the OSPs enforced by the TOE and its operational environment, and
\item the assumptions that are upheld for the operational environment of the TOE.
\end{itemize}\<close>

Definition* [sr_def, tag="''security requirement''", short_tag="Some(''SR'')"]
\<open>requirement, stated in a standardised language, which is meant to contribute
to achieving the security objectives for a TOE\<close>
text \<open>@{docitem toe_def}\<close>
Definition* [st, tag="''Security Target''", short_tag="Some(''ST'')"]
\<open>implementation-dependent statement of security needs for a specific i\<section>dentified @{docitem toe_def}\<close>
(*<*)
text \<open>@{docitem (unchecked) toeDef}\<close>
(*>*)
Definition* [st, tag="''Security Target''", short_tag="Some(''ST'')"]
\<open>implementation-dependent statement of security needs for a specific identified
@{docitem (unchecked) toeDef}\<close>

Definition* [slct_def, tag="''selection''"]
\<open>specification of one or more items from a list in a component\<close>
@@ -379,13 +394,13 @@ Definition* [toe_res_def, tag="''TOE resource''"]

Definition* [toe_sf_def, tag="''TOE security functionality''", short_tag= "Some(''TSF'')"]
\<open>combined functionality of all hardware, software, and firmware of a TOE that must be relied upon
for the correct enforcement of the @{docitem sfr_def}s\<close>
for the correct enforcement of the @{docitem (unchecked) sfr_def}s\<close>

Definition* [tr_vrb_def, tag="''trace, verb''"]
\<open>perform an informal correspondence analysis between two entities with only a
minimal level of rigour\<close>

Definition* [trnsfs_out_toe_def, tag="''transfers outside of the TOE''"]
Definition* [trnsfs_out_toeDef, tag="''transfers outside of the TOE''"]
\<open>TSF mediated communication of data to entities not under the control of the TSF\<close>

Definition* [transl_def, tag= "''translation''"]
@@ -430,13 +445,22 @@ effort is required of the evaluator.\<close>

Definition* [dev_def, tag="''Developer''"]
\<open>who respond to actual or perceived consumer security requirements in
constructing a @{docitem toe_def}, reference this CC_Part_3
constructing a @{docitem (unchecked) toeDef}, reference this CC\_Part\_3
when interpreting statements of assurance requirements and determining
assurance approaches of @{docitem toe}s.\<close>

Definition*[evalu_def, tag="'' Evaluator''"]
\<open>who use the assurance requirements defined in CC_Part_3
\<open>who use the assurance requirements defined in CC\_Part\_3
as mandatory statement of evaluation criteria when determining the assurance
of @{docitem toe_def}s and when evaluating @{docitem pp_def}s and @{docitem st_def}s.\<close>
of @{docitem (unchecked) toeDef}s and when evaluating @{docitem ppDef}s
and @{docitem (unchecked) stDef}s.\<close>


Definition*[toeDef] \<open>\<close>
Definition*[sfrs_def] \<open>\<close>
Definition*[sfr_def] \<open>\<close>
Definition*[stDef] \<open>\<close>
Definition*[sfp_def] \<open>\<close>
Definition*[tsf_def] \<open>\<close>

end
@@ -10,16 +10,18 @@
* SPDX-License-Identifier: BSD-2-Clause
*************************************************************************)

section\<open>CC 3.1.R5\<close>
(*<*)
theory "CC_v3_1_R5"
imports "Isabelle_DOF.technical_report"
imports
"Isabelle_DOF.technical_report"
"CC_terminology"


begin
(*>*)

section \<open>General Infrastructure on CC Evaluations\<close>
subsection \<open>General Infrastructure on CC Evaluations\<close>

datatype EALs = EAL1 | EAL2 | EAL3 | EAL4 | EAL5 | EAL6 | EAL7

@@ -30,7 +32,7 @@ doc_class CC_structure_element =(* text_element + *)
doc_class CC_text_element = text_element +
eval_level :: EALs

section \<open>Security target ontology\<close>
subsection \<open>Security target ontology\<close>


doc_class st_ref_cls = CC_text_element +
@@ -53,12 +55,11 @@ doc_class toe_ovrw_cls = CC_text_element +
firmeware_req:: "CC_text_element list" <= "[]"
features_req :: "CC_text_element list" <= "[]"
invariant eal_consistency::
"\<lambda> X::toe_ovrw_cls .
(case eval_level X of
EAL1 \<Rightarrow> software_req X \<noteq> []
| EAL2 \<Rightarrow> software_req X \<noteq> []
| EAL3 \<Rightarrow> software_req X \<noteq> []
| EAL4 \<Rightarrow> software_req X \<noteq> []
"(case eval_level \<sigma> of
EAL1 \<Rightarrow> software_req \<sigma> \<noteq> []
| EAL2 \<Rightarrow> software_req \<sigma> \<noteq> []
| EAL3 \<Rightarrow> software_req \<sigma> \<noteq> []
| EAL4 \<Rightarrow> software_req \<sigma> \<noteq> []
| _ \<Rightarrow> undefined)"

thm eal_consistency_inv_def
@@ -0,0 +1,57 @@
%% Copyright (C) University of Exeter
%% University of Paris-Saclay
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause

\NeedsTeXFormat{LaTeX2e}\relax
\ProvidesPackage{DOF-CC_terminology}
[00/00/0000 Document-Type Support Framework for Isabelle (CC).]

\RequirePackage{DOF-COL}
\usepackage{etex}
\ifdef{\reserveinserts}{\reserveinserts{28}}{}




\newkeycommand*{\mathcc}[label=,type=%
, scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTshortUNDERSCOREname ={}%
, scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTmcc = %
, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel =%
, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable =%
, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants =%
, scholarlyUNDERSCOREpaperDOTtextUNDERSCOREsectionDOTmainUNDERSCOREauthor =%
, scholarlyUNDERSCOREpaperDOTtextUNDERSCOREsectionDOTfixmeUNDERSCORElist =%
, IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel =%
, scholarlyUNDERSCOREpaperDOTtechnicalDOTdefinitionUNDERSCORElist =%
, scholarlyUNDERSCOREpaperDOTtechnicalDOTstatus =%
, CCUNDERSCOREterminologyDOTconceptUNDERSCOREdefinitionDOTtag=%
, CCUNDERSCOREterminologyDOTconceptUNDERSCOREdefinitionDOTshortUNDERSCOREtag=%
]
[1]
{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTshortUNDERSCOREname}} {} }
{%
\begin{\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTmcc}}\label{\commandkey{label}}
#1
\end{\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTmcc}}
}{%
\begin{\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTmcc}}[\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTshortUNDERSCOREname}]\label{\commandkey{label}}
#1
\end{\commandkey{scholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontentDOTmcc}}
}
\end{isamarkuptext}%
}

\expandafter\def\csname isaDofDOTtextDOTscholarlyUNDERSCOREpaperDOTmathUNDERSCOREcontent\endcsname{\mathcc}

@@ -24,10 +24,14 @@ identifies:
\<close>

(*<<*)
theory CENELEC_50128
imports "Isabelle_DOF.technical_report"
theory
CENELEC_50128
imports
"Isabelle_DOF.technical_report"
begin

define_ontology "DOF-CENELEC_50128.sty" "CENELEC 50128"

(* this is a hack and should go into an own ontology, providing things like:
- Assumption*
- Hypothesis*
@@ -155,10 +159,10 @@ which have the required safety integrity level.\<close>
Definition*[entity]
\<open>person, group or organisation who fulfils a role as defined in this European Standard.\<close>

declare_reference*[fault]
declare_reference*[fault::cenelec_term]
Definition*[error]
\<open>defect, mistake or inaccuracy which could result in failure or in a deviation
from the intended performance or behaviour (cf. @{cenelec_term (unchecked) \<open>fault\<close>})).\<close>
from the intended performance or behaviour (cf. @{cenelec_term (unchecked) \<open>fault\<close>}).\<close>

Definition*[fault]
\<open>defect, mistake or inaccuracy which could result in failure or in a deviation
@@ -521,9 +525,11 @@ text\<open>Figure 3 in Chapter 5: Illustrative Development Lifecycle 1\<close>

text\<open>Global Overview\<close>

(*
figure*[fig3::figure, relative_width="100",
src="''examples/CENELEC_50128/mini_odo/document/figures/CENELEC-Fig.3-docStructure.png''"]
\<open>Illustrative Development Lifecycle 1\<close>
*)

text\<open>Actually, Figure 4 in Chapter 5: Illustrative Development Lifecycle 2 is more faithful
to the remaining document: Software Architecture and Design phases are merged, like in 7.3.\<close>
@@ -614,9 +620,10 @@ doc_class cenelec_report = text_element +
invariant must_be_chapter :: "text_element.level \<sigma> = Some(0)"
invariant three_eyes_prcpl:: " written_by \<sigma> \<noteq> fst_check \<sigma>
\<and> written_by \<sigma> \<noteq> snd_check \<sigma>"


(*
text\<open>see \<^figure>\<open>fig3\<close> and Fig 4 in Chapter 5: Illustrative Development Lifecycle 2\<close>

*)
doc_class external_specification =
phase :: "phase" <= "SYSDEV_ext"

@@ -1007,14 +1014,16 @@ ML\<open>
fun check_sil oid _ ctxt =
let
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
val monitor_record_value = #value (the (DOF_core.get_object_local oid ctxt'))
val DOF_core.Instance {value = monitor_record_value, ...} =
DOF_core.get_instance_global oid (Context.theory_of ctxt)
val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
fun check_sil'' [] = true
| check_sil'' (x::xs) =
let
val (_, doc_oid) = x
val doc_record_value = #value (the (DOF_core.get_object_local doc_oid ctxt'))
val DOF_core.Instance {value = doc_record_value, ...} =
DOF_core.get_instance_global doc_oid (Context.theory_of ctxt)
val Const _ $ _ $ _ $ _ $ _ $ cenelec_document_ext = doc_record_value
val Const _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = cenelec_document_ext
in
@@ -1026,27 +1035,37 @@ fun check_sil oid _ ctxt =
in check_sil'' traces end
\<close>

setup\<open>DOF_core.update_class_invariant "CENELEC_50128.monitor_SIL0" check_sil\<close>
setup\<open>
(fn thy =>
let val ctxt = Proof_Context.init_global thy
val cid = "monitor_SIL0"
val binding = DOF_core.binding_from_onto_class_pos cid thy
val cid_long = DOF_core.get_onto_class_name_global cid thy
in DOF_core.add_ml_invariant binding (DOF_core.make_ml_invariant (check_sil, cid_long)) thy end)
\<close>

text\<open>
A more generic example of check_sil which can be generalized:
A more generic example of check\_sil which can be generalized:
it is decoupled from the CENELEC current implementation
but is much less efficient regarding time computation by relying on Isabelle evaluation mechanism.\<close>
ML\<open>
fun check_sil_slow oid _ ctxt =
let
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
val monitor_record_value = #value (the (DOF_core.get_object_local oid ctxt'))
val monitor_cid = #cid (the (DOF_core.get_object_local oid ctxt'))
val DOF_core.Instance {value = monitor_record_value, ...} =
DOF_core.get_instance_global oid (Context.theory_of ctxt)
val DOF_core.Instance {cid = monitor_cid, ...} =
DOF_core.get_instance_global oid (Context.theory_of ctxt)
val monitor_sil_typ = (Syntax.read_typ ctxt' monitor_cid) --> @{typ "sil"}
val monitor_sil = Value_Command.value ctxt'
(Const("CENELEC_50128.monitor_SIL.sil", monitor_sil_typ) $ monitor_record_value)
val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
fun check_sil' [] = true
| check_sil' (x::xs) =
let
val (doc_cid, doc_oid) = x
val doc_record_value = #value (the (DOF_core.get_object_local doc_oid ctxt'))
val DOF_core.Instance {value = doc_record_value, ...} =
DOF_core.get_instance_global doc_oid (Context.theory_of ctxt)
val doc_sil_typ = (Syntax.read_typ ctxt' doc_cid) --> @{typ "sil"}
val doc_sil = Value_Command.value ctxt'
(Const ("CENELEC_50128.cenelec_document.sil", doc_sil_typ) $ doc_record_value)
@@ -1059,20 +1078,25 @@ fun check_sil_slow oid _ ctxt =
in check_sil' traces end
\<close>

(*setup\<open>DOF_core.update_class_invariant "CENELEC_50128.monitor_SIL0" check_sil_slow\<close>*)
(*setup\<open>
(fn thy =>
let val ctxt = Proof_Context.init_global thy
val binding = DOF_core.binding_from_onto_class_pos "monitor_SIL0" thy
in DOF_core.add_ml_invariant binding check_sil_slow thy end)
\<close>*)

(* As traces of monitor instances (docitems) are updated each time an instance is declared
(with text*, section*, etc.), invariants checking functions which use traces must
be declared as lazy invariants, to be checked only when closing a monitor, i.e.,
after the monitor traces are populated.
(with text*, section*, etc.), invariants checking functions which check the full list of traces
must be declared as lazy invariants, to be checked only when closing a monitor, i.e.,
after all the monitor traces are populated.
*)
ML\<open>
fun check_required_documents oid _ ctxt =
let
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
val {monitor_tab,...} = DOF_core.get_data ctxt'
val {accepted_cids, ...} = the (Symtab.lookup monitor_tab oid)
val traces = AttributeAccess.compute_trace_ML ctxt oid \<^here> \<^here>
val DOF_core.Monitor_Info {accepted_cids, ...} =
DOF_core.get_monitor_info_global oid (Context.theory_of ctxt)
val traces = AttributeAccess.compute_trace_ML ctxt oid NONE \<^here>
fun check_required_documents' [] = true
| check_required_documents' (cid::cids) =
if exists (fn (doc_cid, _) => equal cid doc_cid) traces
@@ -1080,7 +1104,8 @@ fun check_required_documents oid _ ctxt =
else
let
val ctxt' = Proof_Context.init_global(Context.theory_of ctxt)
val monitor_record_value = #value (the (DOF_core.get_object_local oid ctxt'))
val DOF_core.Instance {value = monitor_record_value, ...} =
DOF_core.get_instance_global oid (Context.theory_of ctxt)
val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
in error ("A " ^ cid ^ " cenelec document is required with "
^ Syntax.string_of_term ctxt' monitor_sil)
@@ -1088,7 +1113,15 @@ fun check_required_documents oid _ ctxt =
in check_required_documents' accepted_cids end
\<close>

setup\<open>DOF_core.update_class_lazy_invariant "CENELEC_50128.monitor_SIL0" check_required_documents\<close>
setup\<open>
fn thy =>
let val ctxt = Proof_Context.init_global thy
val cid = "monitor_SIL0"
val binding = DOF_core.binding_from_onto_class_pos cid thy
val cid_long = DOF_core.get_onto_class_name_global cid thy
in DOF_core.add_closing_ml_invariant binding
(DOF_core.make_ml_invariant (check_required_documents, cid_long)) thy end
\<close>

(* Test pattern matching for the records of the current CENELEC implementation classes,
and used by checking functions.
@@ -1099,11 +1132,11 @@ text*[MonitorPatternMatchingTest::monitor_SIL0]\<open>\<close>
text*[CenelecClassPatternMatchingTest::SQAP, sil = "SIL0"]\<open>\<close>
ML\<open>
val thy = @{theory}
val monitor_record_value =
#value (the (DOF_core.get_object_global "MonitorPatternMatchingTest" thy))
val DOF_core.Instance {value = monitor_record_value, ...} =
DOF_core.get_instance_global "MonitorPatternMatchingTest" thy
val Const _ $ _ $ monitor_sil $ _ = monitor_record_value
val doc_record_value = #value (the (DOF_core.get_object_global
"CenelecClassPatternMatchingTest" thy))
val DOF_core.Instance {value = doc_record_value, ...} =
DOF_core.get_instance_global "CenelecClassPatternMatchingTest" thy
val Const _ $ _ $ _ $ _ $ _ $ cenelec_document_ext = doc_record_value
val Const _ $ _ $ _ $ doc_sil $ _ $ _ $ _ $ _ $ _ $ _ = cenelec_document_ext
\<close>
@@ -1236,15 +1269,16 @@ doc_class test_documentation = (* OUTDATED ? *)


section\<open>Global Documentation Structure\<close>

(*<<*)
doc_class global_documentation_structure = text_element +
level :: "int option" <= "Some(-1::int)" \<comment> \<open>document must be a chapter\<close>
accepts "SYSREQS ~~ \<comment> \<open>system_requirements_specification\<close>
SYSSREQS ~~ \<comment> \<open>system_safety_requirements_specification\<close>
SYSAD ~~ \<comment> \<open>system_architecture description\<close>
accepts "SYSREQS ~~ \<comment> \<open>system requirements specification\<close>
SYSSREQS ~~ \<comment> \<open>system safety requirements specification\<close>
SYSAD ~~ \<comment> \<open>system architecture description\<close>
SYSS_pl ~~ \<comment> \<open>system safety plan\<close>
(SWRS || OSWTS) " \<comment> \<open>software requirements specification OR
overall software test specification\<close>
(*>>*)
(* MORE TO COME : *)

section\<open> META : Testing and Validation \<close>
@@ -1252,10 +1286,10 @@ section\<open> META : Testing and Validation \<close>
text\<open>Test : @{semi_formal_content \<open>COTS\<close>}\<close>

ML
\<open> DOF_core.read_cid_global @{theory} "requirement";
DOF_core.read_cid_global @{theory} "SRAC";
DOF_core.is_defined_cid_global "SRAC" @{theory};
DOF_core.is_defined_cid_global "EC" @{theory}; \<close>
\<open> DOF_core.get_onto_class_name_global "requirement" @{theory};
DOF_core.get_onto_class_name_global "SRAC" @{theory};
DOF_core.get_onto_class_global "SRAC" @{theory};
DOF_core.get_onto_class_global "EC" @{theory}; \<close>

ML
\<open> DOF_core.is_subclass @{context} "CENELEC_50128.EC" "CENELEC_50128.EC";
@@ -1264,18 +1298,19 @@ ML
DOF_core.is_subclass @{context} "CENELEC_50128.EC" "CENELEC_50128.test_requirement"; \<close>

ML
\<open> val {docobj_tab={maxano, tab=ref_tab},docclass_tab=class_tab,...} = DOF_core.get_data @{context};
Symtab.dest ref_tab;
Symtab.dest class_tab; \<close>
\<open> val ref_tab = DOF_core.get_instances \<^context>
val docclass_tab = DOF_core.get_onto_classes @{context};
Name_Space.dest_table ref_tab;
Name_Space.dest_table docclass_tab; \<close>

ML
\<open> val internal_data_of_SRAC_definition = DOF_core.get_attributes_local "SRAC" @{context} \<close>

ML
\<open> DOF_core.read_cid_global @{theory} "requirement";
\<open> DOF_core.get_onto_class_name_global "requirement" @{theory};
Syntax.parse_typ @{context} "requirement";
val Type(t,_) = Syntax.parse_typ @{context} "requirement" handle ERROR _ => dummyT;
Syntax.read_typ @{context} "hypothesis" handle _ => dummyT;
Proof_Context.init_global; \<close>

end
end
@ -0,0 +1,397 @@
|
|||
(*************************************************************************
|
||||
* Copyright (C)
|
||||
* 2019-2023 The University of Exeter
|
||||
* 2018-2023 The University of Paris-Saclay
|
||||
* 2018 The University of Sheffield
|
||||
*
|
||||
* License:
|
||||
* This program can be redistributed and/or modified under the terms
|
||||
* of the 2-clause BSD-style license.
|
||||
*
|
||||
* SPDX-License-Identifier: BSD-2-Clause
|
||||
*************************************************************************)
|
||||
|
||||
(*<<*)
|
||||
theory
|
||||
CENELEC_50128_Documentation
|
||||
imports
|
||||
CENELEC_50128
|
||||
|
||||
begin
|
||||
|
||||
define_shortcut* dof \<rightleftharpoons> \<open>\dof\<close>
|
||||
isadof \<rightleftharpoons> \<open>\isadof{}\<close>
|
||||
define_shortcut* TeXLive \<rightleftharpoons> \<open>\TeXLive\<close>
|
||||
BibTeX \<rightleftharpoons> \<open>\BibTeX{}\<close>
|
||||
LaTeX \<rightleftharpoons> \<open>\LaTeX{}\<close>
|
||||
TeX \<rightleftharpoons> \<open>\TeX{}\<close>
|
||||
pdf \<rightleftharpoons> \<open>PDF\<close>
|
||||
|
||||
ML\<open>
|
||||
|
||||
fun boxed_text_antiquotation name (* redefined in these more abstract terms *) =
|
||||
DOF_lib.gen_text_antiquotation name DOF_lib.report_text
|
||||
(fn ctxt => DOF_lib.string_2_text_antiquotation ctxt
|
||||
#> DOF_lib.enclose_env false ctxt "isarbox")
|
||||
|
||||
val neant = K(Latex.text("",\<^here>))
|
||||
|
||||
fun boxed_theory_text_antiquotation name (* redefined in these more abstract terms *) =
|
||||
DOF_lib.gen_text_antiquotation name DOF_lib.report_theory_text
|
||||
(fn ctxt => DOF_lib.string_2_theory_text_antiquotation ctxt
|
||||
#> DOF_lib.enclose_env false ctxt "isarbox"
|
||||
(* #> neant *)) (*debugging *)
|
||||
|
||||
fun boxed_sml_text_antiquotation name =
|
||||
DOF_lib.gen_text_antiquotation name (K(K()))
|
||||
(fn ctxt => Input.source_content
|
||||
#> Latex.text
|
||||
#> DOF_lib.enclose_env true ctxt "sml")
|
||||
(* the simplest conversion possible *)
|
||||
|
||||
fun boxed_pdf_antiquotation name =
|
||||
DOF_lib.gen_text_antiquotation name (K(K()))
|
||||
(fn ctxt => Input.source_content
|
||||
#> Latex.text
|
||||
#> DOF_lib.enclose_env true ctxt "out")
|
||||
(* the simplest conversion possible *)
|
||||
|
||||
fun boxed_latex_antiquotation name =
|
||||
DOF_lib.gen_text_antiquotation name (K(K()))
|
||||
(fn ctxt => Input.source_content
|
||||
#> Latex.text
|
||||
#> DOF_lib.enclose_env true ctxt "ltx")
|
||||
(* the simplest conversion possible *)
|
||||
|
||||
fun boxed_bash_antiquotation name =
|
||||
DOF_lib.gen_text_antiquotation name (K(K()))
|
||||
(fn ctxt => Input.source_content
|
||||
#> Latex.text
|
||||
#> DOF_lib.enclose_env true ctxt "bash")
|
||||
(* the simplest conversion possible *)
|
||||
\<close>
|
||||
|
||||
setup\<open>(* std_text_antiquotation \<^binding>\<open>my_text\<close> #> *)
|
||||
boxed_text_antiquotation \<^binding>\<open>boxed_text\<close> #>
|
||||
(* std_text_antiquotation \<^binding>\<open>my_cartouche\<close> #> *)
|
||||
boxed_text_antiquotation \<^binding>\<open>boxed_cartouche\<close> #>
|
||||
(* std_theory_text_antiquotation \<^binding>\<open>my_theory_text\<close>#> *)
|
||||
boxed_theory_text_antiquotation \<^binding>\<open>boxed_theory_text\<close> #>
|
||||
|
||||
boxed_sml_text_antiquotation \<^binding>\<open>boxed_sml\<close> #>
|
||||
boxed_pdf_antiquotation \<^binding>\<open>boxed_pdf\<close> #>
|
||||
boxed_latex_antiquotation \<^binding>\<open>boxed_latex\<close>#>
|
||||
boxed_bash_antiquotation \<^binding>\<open>boxed_bash\<close>
|
||||
\<close>
|
||||
|
||||
|
||||
|
||||
(*>>*)
|
||||
|
||||
section*[cenelec_onto::example]\<open>Writing Certification Documents \<^boxed_theory_text>\<open>CENELEC_50128\<close>\<close>
|
||||
subsection\<open>The CENELEC 50128 Example\<close>
|
||||
text\<open>
|
||||
The ontology \<^verbatim>\<open>CENELEC_50128\<close>\index{ontology!CENELEC\_50128} is a small ontology modeling
|
||||
documents for a certification following CENELEC 50128~@{cite "boulanger:cenelec-50128:2015"}.
|
||||
The \<^isadof> distribution contains a small example using the ontology ``CENELEC\_50128'' in
|
||||
the directory \nolinkurl{examples/CENELEC_50128/mini_odo/}. You can inspect/edit the
|
||||
integrated source example by either
|
||||
\<^item> starting Isabelle/jEdit using your graphical user interface (\<^eg>, by clicking on the
|
||||
Isabelle-Icon provided by the Isabelle installation) and loading the file
|
||||
\nolinkurl{examples/CENELEC_50128/mini_odo/mini_odo.thy}.
|
||||
\<^item> starting Isabelle/jEdit from the command line by calling:
|
||||
|
||||
@{boxed_bash [display]\<open>ë\prompt{\isadofdirn}ë
|
||||
isabelle jedit examples/CENELEC_50128/mini_odo/mini_odo.thy \<close>}
|
||||
\<close>
|
||||
text\<open>\<^noindent> Finally, you
|
||||
\<^item> can build the \<^pdf>-document by calling:
|
||||
@{boxed_bash [display]\<open>ë\prompt{\isadofdirn}ë isabelle build mini_odo \<close>}
|
||||
\<close>
|
||||
|
||||
subsection\<open>Modeling CENELEC 50128\<close>
|
||||
|
||||
text\<open>
|
||||
Documents to be provided in formal certifications (such as CENELEC
|
||||
50128~@{cite "boulanger:cenelec-50128:2015"} or Common Criteria~@{cite "cc:cc-part3:2006"}) can
|
||||
much profit from the control of ontological consistency: a substantial amount of the work
|
||||
of evaluators in formal certification processes consists in tracing down the links from
|
||||
requirements over assumptions down to elements of evidence, be it in form of semi-formal
|
||||
documentation, models, code, or tests. In a certification process, traceability becomes a major
|
||||
concern; and providing mechanisms to ensure complete traceability already at the development of
|
||||
the integrated source can in our view increase the speed and reduce the risk certification
|
||||
processes. Making the link-structure machine-checkable, be it between requirements, assumptions,
|
||||
their implementation and their discharge by evidence (be it tests, proofs, or authoritative
|
||||
arguments), has the potential in our view to decrease the cost of software developments
|
||||
targeting certifications.
|
||||
|
||||
As in many other cases, formal certification documents come with an own terminology and pragmatics
|
||||
of what has to be demonstrated and where, and how the traceability of requirements through
|
||||
design-models over code to system environment assumptions has to be assured.
|
||||
|
||||
In the sequel, we present a simplified version of an ontological model used in a
|
||||
case-study~@{cite "bezzecchi.ea:making:2018"}. We start with an introduction of the concept of
|
||||
requirement:
|
||||
|
||||
@{boxed_theory_text [display]\<open>
|
||||
doc_class requirement = long_name :: "string option"
|
||||
|
||||
doc_class hypothesis = requirement +
|
||||
hyp_type :: hyp_type <= physical (* default *)
|
||||
|
||||
datatype ass_kind = informal | semiformal | formal
|
||||
|
||||
doc_class assumption = requirement +
|
||||
assumption_kind :: ass_kind <= informal
|
||||
\<close>}

Such ontologies can be enriched by larger explanations and examples, which may help
the team of engineers substantially when developing the central document for a certification,
like an explication of what precisely the difference is between a \<^typ>\<open>hypothesis\<close> and an
\<^typ>\<open>assumption\<close> in the context of the evaluation standard. Since PIDE makes the definition
of each document class available by a simple mouse-click, this kind of meta-knowledge
can be made far more accessible during the document evolution.

For example, the term of category \<^typ>\<open>assumption\<close> is used for domain-specific assumptions.
It has \<^const>\<open>formal\<close>, \<^const>\<open>semiformal\<close> and \<^const>\<open>informal\<close> sub-categories. They have to be
tracked and discharged by appropriate validation procedures within a
certification process, be it by test or proof. It is different from a \<^typ>\<open>hypothesis\<close>, which is
globally assumed and accepted.

In the sequel, the category \<^typ>\<open>exported_constraint\<close> (or \<^typ>\<open>EC\<close> for short)
is used for formal assumptions that arise during the analysis,
design or implementation, have to be tracked until the final
evaluation target, and have to be discharged by appropriate validation procedures
within the certification process, be it by test or proof. A particular class of interest
is the category \<^typ>\<open>safety_related_application_condition\<close> (or \<^typ>\<open>SRAC\<close>
for short), which is used for \<^typ>\<open>EC\<close>'s that establish safety properties
of the evaluation target. Their traceability throughout the certification
is therefore particularly critical. This is naturally modeled as follows:
@{boxed_theory_text [display]\<open>
doc_class EC = assumption +
     assumption_kind :: ass_kind <= (*default *) formal

doc_class SRAC = EC +
     assumption_kind :: ass_kind <= (*default *) formal
\<close>}

We can now, \<^eg>, write

@{boxed_theory_text [display]\<open>
text*[ass123::SRAC]\<open>
  The overall sampling frequency of the odometer subsystem is therefore
  14 kHz, which includes sampling, computing and result communication
  times \ldots
\<close>
\<close>}

This will be shown in the \<^pdf> as follows:
\<close>
text*[ass123::SRAC] \<open> The overall sampling frequency of the odometer
subsystem is therefore 14 kHz, which includes sampling, computing and
result communication times \ldots \<close>

text\<open>Note that this \<^pdf>-output is the result of a specific setup for \<^typ>\<open>SRAC\<close>s.\<close>

subsection*[ontopide::technical]\<open>Editing Support for CENELEC 50128\<close>
figure*[figfig3::figure,relative_width="95",file_src="''figures/antiquotations-PIDE.png''"]
\<open> Standard antiquotations referring to theory elements.\<close>
text\<open> The corresponding view in @{docitem \<open>figfig3\<close>} shows the core part of a document
conforming to the \<^verbatim>\<open>CENELEC_50128\<close> ontology. The first sample shows standard Isabelle antiquotations
@{cite "wenzel:isabelle-isar:2020"} referring to formal entities of a theory. This way, the informal parts
of a document get ``formal content'' and become more robust under change.\<close>

figure*[figfig5::figure, relative_width="95", file_src="''figures/srac-definition.png''"]
\<open> Defining a \<^typ>\<open>SRAC\<close> in the integrated source ... \<close>

figure*[figfig7::figure, relative_width="95", file_src="''figures/srac-as-es-application.png''"]
\<open> Using a \<^typ>\<open>SRAC\<close> as \<^typ>\<open>EC\<close> document element. \<close>
text\<open> The subsequent sample in @{figure \<open>figfig5\<close>} shows the definition of a
\<^emph>\<open>safety-related application condition\<close>, a side-condition of a theorem which
has the consequence that a certain calculation must be executed sufficiently fast on an embedded
device. This condition cannot be established inside the formal theory but has to be
checked by system integration tests. In @{figure \<open>figfig7\<close>} we then reference this
safety-related condition; however, this happens in a context where general \<^emph>\<open>exported constraints\<close>
are listed. \<^isadof> checks and establishes that this is legal in the given ontology.
\<close>

text\<open>
\<^item> \<^theory_text>\<open>@{term_ \<open>term\<close> }\<close> parses and type-checks \<open>term\<close> with term antiquotations;
  for instance, \<^theory_text>\<open>@{term_ \<open>@{cenelec-term \<open>FT\<close>}\<close>}\<close> will parse and check
  that \<open>FT\<close> is indeed an instance of the class \<^typ>\<open>cenelec_term\<close>.
\<close>

subsection\<open>A Domain-Specific Ontology: \<^verbatim>\<open>CENELEC_50128\<close>\<close>
(*<*)
ML\<open>val toLaTeX = String.translate (fn c => if c = #"_" then "\\_" else String.implode[c])\<close>
ML\<open>writeln (DOF_core.print_doc_class_tree
                 @{context} (fn (n,l) => true (* String.isPrefix "technical_report" l
                                                 orelse String.isPrefix "Isa_COL" l *))
                 toLaTeX)\<close>
(*>*)
text\<open> The \<^verbatim>\<open>CENELEC_50128\<close> ontology in \<^theory>\<open>Isabelle_DOF-Ontologies.CENELEC_50128\<close>
is an example of a domain-specific ontology.
It is based on \<^verbatim>\<open>technical_report\<close>, since we assume that this kind of format will be the most
appropriate one for this type of long-and-tedious document:

%
\begin{center}
\begin{minipage}{.9\textwidth}\footnotesize
\dirtree{%
.0 .
.1 CENELEC\_50128.judgement\DTcomment{...}.
.1 CENELEC\_50128.test\_item\DTcomment{...}.
.2 CENELEC\_50128.test\_case\DTcomment{...}.
.2 CENELEC\_50128.test\_tool\DTcomment{...}.
.2 CENELEC\_50128.test\_result\DTcomment{...}.
.2 CENELEC\_50128.test\_adm\_role\DTcomment{...}.
.2 CENELEC\_50128.test\_environment\DTcomment{...}.
.2 CENELEC\_50128.test\_requirement\DTcomment{...}.
.2 CENELEC\_50128.test\_specification\DTcomment{...}.
.1 CENELEC\_50128.objectives\DTcomment{...}.
.1 CENELEC\_50128.design\_item\DTcomment{...}.
.2 CENELEC\_50128.interface\DTcomment{...}.
.1 CENELEC\_50128.sub\_requirement\DTcomment{...}.
.1 CENELEC\_50128.test\_documentation\DTcomment{...}.
.1 Isa\_COL.text\_element\DTcomment{...}.
.2 CENELEC\_50128.requirement\DTcomment{...}.
.3 CENELEC\_50128.TC\DTcomment{...}.
.3 CENELEC\_50128.FnI\DTcomment{...}.
.3 CENELEC\_50128.SIR\DTcomment{...}.
.3 CENELEC\_50128.CoAS\DTcomment{...}.
.3 CENELEC\_50128.HtbC\DTcomment{...}.
.3 CENELEC\_50128.SILA\DTcomment{...}.
.3 CENELEC\_50128.assumption\DTcomment{...}.
.4 CENELEC\_50128.AC\DTcomment{...}.
.5 CENELEC\_50128.EC\DTcomment{...}.
.6 CENELEC\_50128.SRAC\DTcomment{...}.
.3 CENELEC\_50128.hypothesis\DTcomment{...}.
.4 CENELEC\_50128.security\_hyp\DTcomment{...}.
.3 CENELEC\_50128.safety\_requirement\DTcomment{...}.
.2 CENELEC\_50128.cenelec\_text\DTcomment{...}.
.3 CENELEC\_50128.SWAS\DTcomment{...}.
.3 [...].
.2 scholarly\_paper.text\_section\DTcomment{...}.
.3 scholarly\_paper.technical\DTcomment{...}.
.4 scholarly\_paper.math\_content\DTcomment{...}.
.5 CENELEC\_50128.semi\_formal\_content\DTcomment{...}.
.1 ...
}
\end{minipage}
\end{center}
\<close>

(* TODO : Rearrange ontology hierarchies. *)

subsubsection\<open>Examples\<close>

text\<open>
The category ``exported constraint (EC)'' is defined in the file
\<^file>\<open>CENELEC_50128.thy\<close> as follows:

@{boxed_theory_text [display]\<open>
doc_class requirement = text_element +
   long_name    :: "string option"
   is_concerned :: "role set"
doc_class assumption = requirement +
     assumption_kind :: ass_kind <= informal
doc_class AC = assumption +
     is_concerned :: "role set" <= "UNIV"
doc_class EC = AC +
     assumption_kind :: ass_kind <= (*default *) formal
\<close>}
\<close>

text\<open>
We now define the document representations in the file
\<^file>\<open>DOF-CENELEC_50128.sty\<close>. Let us assume that we want to
register the definition of \<^typ>\<open>EC\<close>'s in a dedicated table of contents (\<^boxed_latex>\<open>toe\<close>)
and use an earlier defined environment \inlineltx|\begin{EC}...\end{EC}| for their graphical
representation. Note that the \inlineltx|\newisadof{}[]{}|-command requires the
fully-qualified names, \<^eg>, \<^boxed_theory_text>\<open>text.CENELEC_50128.EC\<close> for the document class and
\<^boxed_theory_text>\<open>CENELEC_50128.requirement.long_name\<close> for the attribute \<^const>\<open>long_name\<close>
inherited from the document class \<^typ>\<open>requirement\<close>. The representation of \<^typ>\<open>EC\<close>'s
can now be defined as follows:
% TODO:
% Explain the text qualifier of the long_name text.CENELEC_50128.EC

\begin{ltx}
\newisadof{text.CENELEC_50128.EC}%
[label=,type=%
,Isa_COL.text_element.level=%
,Isa_COL.text_element.referentiable=%
,Isa_COL.text_element.variants=%
,CENELEC_50128.requirement.is_concerned=%
,CENELEC_50128.requirement.long_name=%
,CENELEC_50128.EC.assumption_kind=][1]{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{CENELEC_50128.requirement.long_name}}{}}{%
  % If long_name is not defined, we only create an entry in the table "toe"
  % using the auto-generated number of the EC.
  \begin{EC}%
      \addxcontentsline{toe}{chapter}[]{\autoref{\commandkey{label}}}%
}{%
  % If long_name is defined, we use the long_name as title in the
  % layout of the EC, in the table "toe", and as index entry.
  \begin{EC}[\commandkey{CENELEC_50128.requirement.long_name}]%
    \addxcontentsline{toe}{chapter}[]{\autoref{\commandkey{label}}: %
          \commandkey{CENELEC_50128.requirement.long_name}}%
    \DOFindex{EC}{\commandkey{CENELEC_50128.requirement.long_name}}%
}%
\label{\commandkey{label}}% we use the label attribute as anchor
#1% The main text of the EC
\end{EC}
\end{isamarkuptext}%
}
\end{ltx}
\<close>
text\<open>
For example, the @{docitem "ass123"} is mapped to

@{boxed_latex [display]
\<open>\begin{isamarkuptext*}%
[label = {ass123},type = {CENELEC_50128.SRAC},
args={label = {ass123}, type = {CENELEC_50128.SRAC},
      CENELEC_50128.EC.assumption_kind = {formal}}
] The overall sampling frequency of the odometer subsystem is therefore
  14 kHz, which includes sampling, computing and result communication
  times ...
\end{isamarkuptext*}\<close>}

This environment is mapped to a plain \<^LaTeX> command via:
@{boxed_latex [display]
 \<open> \NewEnviron{isamarkuptext*}[1][]{\isaDof[env={text},#1]{\BODY}} \<close>}
\<close>

text\<open>
For the command-based setup, \<^isadof> provides a dispatcher that selects the most specific
implementation for a given \<^boxed_theory_text>\<open>doc_class\<close>:

@{boxed_latex [display]
\<open>%% The Isabelle/DOF dispatcher:
\newkeycommand+[\|]\isaDof[env={UNKNOWN},label=,type={dummyT},args={}][1]{%
  \ifcsname isaDof.\commandkey{type}\endcsname%
      \csname isaDof.\commandkey{type}\endcsname%
             [label=\commandkey{label},\commandkey{args}]{#1}%
  \else\relax\fi%
  \ifcsname isaDof.\commandkey{env}.\commandkey{type}\endcsname%
     \csname isaDof.\commandkey{env}.\commandkey{type}\endcsname%
            [label=\commandkey{label},\commandkey{args}]{#1}%
  \else%
    \message{Isabelle/DOF: Using default LaTeX representation for concept %
      "\commandkey{env}.\commandkey{type}".}%
    \ifcsname isaDof.\commandkey{env}\endcsname%
       \csname isaDof.\commandkey{env}\endcsname%
              [label=\commandkey{label}]{#1}%
    \else%
    \errmessage{Isabelle/DOF: No LaTeX representation for concept %
      "\commandkey{env}.\commandkey{type}" defined and no default %
      definition for "\commandkey{env}" available either.}%
    \fi%
  \fi%
}\<close>}
\<close>



(*<*)
end
(*>*)

@ -0,0 +1,220 @@
%% Copyright (C) 2019 University of Exeter
%%               2018 University of Paris-Saclay
%%               2018 The University of Sheffield
%%
%% License:
%%   This program can be redistributed and/or modified under the terms
%%   of the LaTeX Project Public License Distributed from CTAN
%%   archives in directory macros/latex/base/lppl.txt; either
%%   version 1.3c of the License, or (at your option) any later version.
%% OR
%%   The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause

\NeedsTeXFormat{LaTeX2e}\relax
\ProvidesPackage{DOF-cenelec_50128}
  [00/00/0000 Document-Type Support Framework for Isabelle (CENELEC 50128).]

\RequirePackage{DOF-COL}
\usepackage{etex}
\ifdef{\reserveinserts}{\reserveinserts{28}}{}
\usepackage[many]{tcolorbox}
\usepackage{marginnote}

% Index setup
\usepackage{index}
\makeindex
\AtEndDocument{\printindex}

\newcommand{\DOFindex}[2]{%
  \marginnote{\normalfont\textbf{#1}: #2}%
  \expandafter\index\expandafter{\expanded{#2 (#1)}}%
}%
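% Usage sketch (hypothetical values, for illustration only): a call like
%   \DOFindex{EC}{sampling frequency}
% places the margin note "EC: sampling frequency" next to the current
% paragraph and records the index entry "sampling frequency (EC)".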


%% SRAC
\providecolor{SRAC}{named}{green}
\ifcsdef{DeclareNewTOC}{%
  \DeclareNewTOC[%
    owner=\jobname,
    type=SRAC,%
    types=SRACs,%
    listname={List of SRACs}%
  ]{tos}
  \setuptoc{tos}{chapteratlist}
  \AtEndEnvironment{frontmatter}{\listofSRACs}
}{}

\newtheorem{SRAC}{SRAC}
\tcolorboxenvironment{SRAC}{
   boxrule=0pt
  ,boxsep=0pt
  ,colback={white!90!SRAC}
  ,enhanced jigsaw
  ,borderline west={2pt}{0pt}{SRAC}
  ,sharp corners
  ,before skip=10pt
  ,after skip=10pt
  ,breakable
}

\newcommand{\SRACautorefname}{SRAC}
\newisadof{textDOTCENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRAC}%
[label=,type=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTisUNDERSCOREconcerned=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTformalUNDERSCORErepr=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTassumptionUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTECDOTassumptionUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTassumptionDOTassumptionUNDERSCOREkind=%
][1]{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}{}}{%
  \begin{SRAC}%
    \addxcontentsline{tos}{chapter}[]{\autoref{\commandkey{label}}}%
}{%
  \begin{SRAC}[\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}]%
    \addxcontentsline{tos}{chapter}[]{\autoref{\commandkey{label}}: \commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
    \DOFindex{SRAC}{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
}\label{\commandkey{label}}%
#1%
\end{SRAC}
\end{isamarkuptext}%
}

% EC
\providecolor{EC}{named}{blue}
\ifcsdef{DeclareNewTOC}{%
  \DeclareNewTOC[%
    owner=\jobname,
    type=EC,%
    types=ECs,%
    listname={List of ECs}%
  ]{toe}
  \setuptoc{toe}{chapteratlist}
  \AtEndEnvironment{frontmatter}{\listofECs}
}{}

\newtheorem{EC}{EC}
\tcolorboxenvironment{EC}{
   boxrule=0pt
  ,boxsep=0pt
  ,colback={white!90!EC}
  ,enhanced jigsaw
  ,borderline west={2pt}{0pt}{EC}
  ,sharp corners
  ,before skip=10pt
  ,after skip=10pt
  ,breakable
}

\newcommand{\ECautorefname}{EC}
\newisadof{textDOTCENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTEC}%
[label=,type=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTisUNDERSCOREconcerned=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTformalUNDERSCORErepr=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTassumptionUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTECDOTassumptionUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTassumptionDOTassumptionUNDERSCOREkind=%
][1]{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}{}}{%
  \begin{EC}%
    \addxcontentsline{toe}{chapter}[]{\autoref{\commandkey{label}}}%
}{%
  \begin{EC}[\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}]%
    \addxcontentsline{toe}{chapter}[]{\autoref{\commandkey{label}}: \commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
    \DOFindex{EC}{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
}\label{\commandkey{label}}%
#1%
\end{EC}
\end{isamarkuptext}%
}

% assumptions
\providecolor{assumption}{named}{orange}
\newtheorem{assumption}{assumption}
\tcolorboxenvironment{assumption}{
   boxrule=0pt
  ,boxsep=0pt
  ,colback={white!90!assumption}
  ,enhanced jigsaw
  ,borderline west={2pt}{0pt}{assumption}
  ,sharp corners
  ,before skip=10pt
  ,after skip=10pt
  ,breakable
}

\newcommand{\assumptionautorefname}{assumption}
\newisadof{textDOTCENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTassumption}%
[label=,type=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTisUNDERSCOREconcerned=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTformalUNDERSCORErepr=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTassumptionUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTassumptionDOTassumptionUNDERSCOREkind=%
][1]{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}{}}{%
  \begin{assumption}%
}{%
  \begin{assumption}[\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}]%
    \DOFindex{assumption}{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
}\label{\commandkey{label}}%
#1%
\end{assumption}
\end{isamarkuptext}%
}


% hypotheses
\providecolor{hypothesis}{named}{teal}
\newtheorem{hypothesis}{hypothesis}
\tcolorboxenvironment{hypothesis}{
   boxrule=0pt
  ,boxsep=0pt
  ,colback={white!90!hypothesis}
  ,enhanced jigsaw
  ,borderline west={2pt}{0pt}{hypothesis}
  ,sharp corners
  ,before skip=10pt
  ,after skip=10pt
  ,breakable
}


\newcommand{\hypothesisautorefname}{hypothesis}
\newisadof{textDOTCENELECUNDERSCOREFIVEZEROONETWOEIGHTDOThypothesis}%
[label=,type=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTlevel=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTreferentiable=%
,IsaUNDERSCORECOLDOTtextUNDERSCOREelementDOTvariants=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTisUNDERSCOREconcerned=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOTformalUNDERSCORErepr=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTSRACDOThypothesisUNDERSCOREkind=%
,CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOThypothesisDOThypUNDERSCOREtype=%
][1]{%
\begin{isamarkuptext}%
\ifthenelse{\equal{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}{}}{%
  \begin{hypothesis}%
}{%
  \begin{hypothesis}[\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}]%
    \DOFindex{hypothesis}{\commandkey{CENELECUNDERSCOREFIVEZEROONETWOEIGHTDOTrequirementDOTlongUNDERSCOREname}}%
}\label{\commandkey{label}}%
#1%
\end{hypothesis}
\end{isamarkuptext}%
}

@ -11,7 +11,7 @@
 * SPDX-License-Identifier: BSD-2-Clause
 *************************************************************************)

section\<open>A conceptual introduction into DOF and its features:\<close>
chapter\<open>A conceptual introduction into DOF and its features:\<close>

theory
  Conceptual
@ -20,23 +20,25 @@ imports
  "Isabelle_DOF.Isa_COL"
begin

section\<open>Excursion: On the semantic consequences of this definition: \<close>

text\<open>Consider the following document class definition and its consequences:\<close>

doc_class A =
   level :: "int option"
   x     :: int

subsection\<open>Excursion: On the semantic consequences of this definition: \<close>

text\<open>This class definition leads to an implicit Isabelle/HOL \<^theory_text>\<open>record\<close> definition
(cf. \<^url>\<open>https://isabelle.in.tum.de/dist/Isabelle2021/doc/isar-ref.pdf\<close>, chapter 11.6.).
(cf. \<^url>\<open>https://isabelle.in.tum.de/doc/isar-ref.pdf\<close>, chapter 11.6.).
Consequently, \<^theory_text>\<open>doc_class\<close>'es inherit the entire theory-infrastructure from Isabelle records:
\<^enum> there is a HOL-type \<^typ>\<open>A\<close> and its extensible version \<^typ>\<open>'a A_scheme\<close>
\<^enum> there are HOL-terms representing \<^emph>\<open>doc_class instances\<close> with the high-level syntax:
\<^enum> there are HOL-terms representing \<^emph>\<open>doc\_class instances\<close> with the high-level syntax:
  \<^enum> \<^term>\<open>undefined\<lparr>level := Some (1::int), x := 5::int \<rparr> :: A\<close>
    (Note that this way to construct an instance is not necessarily computable.)
  \<^enum> \<^term>\<open>\<lparr>tag_attribute = X, level = Y, x = Z\<rparr> :: A\<close>
  \<^enum> \<^term>\<open>\<lparr>tag_attribute = X, level = Y, x = Z, \<dots> = M\<rparr> :: ('a A_scheme)\<close>
\<^enum> there is an entire proof infra-structure allowing to reason about \<^emph>\<open>doc_class instances\<close>;
\<^enum> there is an entire proof infra-structure allowing to reason about \<^emph>\<open>doc\_class instances\<close>;
  this involves the constructor, the selectors (representing the \<^emph>\<open>attributes\<close> in OO lingo),
  the update functions, the rules to establish equality and, if possible, the code generator
  setups:
@ -49,23 +51,61 @@ Consequently, \<^theory_text>\<open>doc_class\<close>'es inherit the entire theo
\<^enum> @{thm [display] A.simps(6)}
\<^enum> ...
\<close>
(* the generated theory of the \<^theory_text>\<open>doc_class\<close> A can be inspected, of course, by *)

text\<open>The generated theory of the \<^theory_text>\<open>doc_class\<close> A can be inspected, of course, by:\<close>
find_theorems (60) name:Conceptual name:A

text\<open>A more abstract view on the state of the DOF machine can be found here:\<close>

print_doc_classes

print_doc_items

text\<open>... and an ML-level output:\<close>

ML\<open>
val docitem_tab = DOF_core.get_instances \<^context>;
val isa_transformer_tab = DOF_core.get_isa_transformers \<^context>;
val docclass_tab = DOF_core.get_onto_classes \<^context>;

\<close>

ML\<open>
Name_Space.dest_table docitem_tab;
Name_Space.dest_table isa_transformer_tab;
Name_Space.dest_table docclass_tab;
\<close>
text\<open>... or as ML assertion: \<close>
ML\<open>
@{assert} (Name_Space.dest_table docitem_tab = []);
fun match ("Conceptual.A", (* the long-name *)
           DOF_core.Onto_Class {params, name, virtual, inherits_from=NONE,
                                attribute_decl, rejectS=[], rex=[], invs=[]})
         = (Binding.name_of name = "A")
  | match _ = false;

@{assert} (exists match (Name_Space.dest_table docclass_tab))
\<close>

text\<open>As a consequence of the theory of the \<^theory_text>\<open>doc_class\<close> \<open>A\<close>, the code-generator setup lets us
evaluate statements such as: \<close>

value\<open> the(A.level (A.make 3 (Some 4) 5)) = 4\<close>

text\<open>And finally, as a consequence of the above semantic construction of \<^theory_text>\<open>doc_class\<close>'es, the internal
text\<open>And further, as a consequence of the above semantic construction of \<^theory_text>\<open>doc_class\<close>'es, the internal
\<open>\<lambda>\<close>-calculus representation of class instances looks as follows:\<close>

ML\<open>
val tt = @{term \<open>the(A.level (A.make 3 (Some 4) 5))\<close>}
@{term \<open>the(A.level (A.make 3 (Some 4) 5))\<close>};
fun match (Const("Option.option.the",_) $
             (Const ("Conceptual.A.level",_) $
               (Const ("Conceptual.A.make", _) $ u $ v $ w))) = true
  | match _ = false;
@{assert} (match @{term \<open>the(A.level (A.make 3 (Some 4) 5))\<close>})
\<close>

text\<open>For the code-generation, we have the following access to values representing class instances:\<close>
text\<open>And finally, via the code-generation, we have the following programmable
access to values representing class instances:\<close>
ML\<open>
val A_make = @{code A.make};
val zero = @{code "0::int"};
@ -75,9 +115,9 @@ val add = @{code "(+) :: int \<Rightarrow> int \<Rightarrow> int"};
A_make zero (SOME one) (add one one)
\<close>

section\<open>Building up a conceptual class hierarchy:\<close>

subsection\<open>An independent class-tree root: \<close>

text\<open>An independent class-tree root: \<close>

doc_class B =
   level :: "int option"
@ -89,9 +129,9 @@ text\<open>We may even use type-synonyms for class synonyms ...\<close>
type_synonym XX = B


subsection\<open>Examples of inheritance \<close>
section\<open>Examples of inheritance \<close>

doc_class C = XX +
doc_class C = B +
   z :: "A option" <= None   (* A LINK, i.e. an attribute that has a type
                                referring to a document class. Mathematical
                                relations over document items can be modeled. *)
@ -125,7 +165,7 @@ doc_class F =
   and br':: "r \<sigma> \<noteq> [] \<and> length(b' \<sigma>) \<ge> 3"
   and cr :: "properties \<sigma> \<noteq> []"

text\<open>The effect of the invariant declaration is to provide internal definitions for validation
text\<open>The effect of the invariant declaration is to provide internal HOL definitions for validation
functions of this invariant. They can be referenced as follows:\<close>
thm br_inv_def
thm br'_inv_def
@ -133,7 +173,7 @@ thm cr_inv_def

term "\<lparr>F.tag_attribute = 5, properties = [], r = [], u = undefined, s = [], b = {}, b' = []\<rparr>"

term "br' (\<lparr>F.tag_attribute = 5, properties = [], r = [], u = undefined, s = [], b = {}, b' = []\<rparr>) "
term "br'_inv (\<lparr>F.tag_attribute = 5, properties = [], r = [], u = undefined, s = [], b = {}, b' = []\<rparr>) "

text\<open>Now, we can use these definitions in order to generate code for these validation functions.
Note, however, that not everything that we can write in an invariant (basically: HOL) is executable,
@ -141,7 +181,7 @@ or even compilable by the code generator setup:\<close>

ML\<open> val cr_inv_code = @{code "cr_inv"} \<close> \<comment> \<open>works albeit thm is abstract ...\<close>
text\<open>while in:\<close>
(* ML\<open> val br_inv_code = @{code "br_inv"} \<close> \<comment>\<open>this does not work ...\<close> *)
ML\<open> val br_inv_code = @{code "br_inv"} \<close> \<comment>\<open>this does not work ...\<close>

text\<open>... the compilation fails due to the fact that nothing prevents the user
from defining an infinite relation between \<^typ>\<open>A\<close> and \<^typ>\<open>C\<close>. However, the alternative
@ -151,30 +191,31 @@ ML\<open> val br'_inv_code = @{code "br'_inv"} \<close> \<comment> \<open>does w

text\<open>... is compilable ...\<close>



doc_class G = C +
   g :: "thm" <= "@{thm \<open>HOL.refl\<close>}"
   g :: "thm" <= "@{thm \<open>HOL.refl\<close>}" (* warning overriding attribute expected*)

doc_class M =
   ok :: "unit"
   accepts "A ~~ \<lbrace>C || D\<rbrace>\<^sup>* ~~ \<lbrakk>F\<rbrakk>"

text\<open>The final class and item tables look like this:\<close>
print_doc_classes
print_doc_items

(*
ML\<open> Document.state();\<close>
ML\<open> Session.get_keywords(); (* this looks to be really session global. *)
     Outer_Syntax.command; \<close>
ML\<open> Thy_Header.get_keywords @{theory};(* this looks to be really theory global. *) \<close>
*)
ML\<open>
map fst (Name_Space.dest_table (DOF_core.get_onto_classes \<^context>));

open_monitor*[aaa::M]
section*[test::A]\<open>Test and Validation\<close>
term\<open>Conceptual.M.make\<close>
text\<open>Defining some document elements to be referenced later on in another theory: \<close>
text*[sdf]\<open> Lorem ipsum @{thm refl}\<close>
text*[ sdfg :: F] \<open> Lorem ipsum @{thm refl}\<close>
text*[ xxxy ] \<open> Lorem ipsum @{F \<open>sdfg\<close>} rate @{thm refl}\<close>
close_monitor*[aaa]
let val class_ids_so_far = ["Conceptual.A", "Conceptual.B", "Conceptual.C", "Conceptual.D",
                            "Conceptual.E", "Conceptual.F", "Conceptual.G", "Conceptual.M",
                            "Isa_COL.float", "Isa_COL.frame", "Isa_COL.figure", "Isa_COL.chapter",
                            "Isa_COL.listing", "Isa_COL.section", "Isa_COL.paragraph",
                            "Isa_COL.subsection", "Isa_COL.text_element", "Isa_COL.subsubsection"]
    val docclass_tab = map fst (Name_Space.dest_table (DOF_core.get_onto_classes \<^context>));
in  @{assert} (class_ids_so_far = docclass_tab) end\<close>

end
section\<open>For Test and Validation\<close>

text*[sdf] \<open> Lorem ipsum ... \<close>      \<comment> \<open>anonymous reference\<close>
text*[sdfg :: F] \<open> Lorem ipsum ...\<close>  \<comment> \<open>some F instance \<close>

end
@@ -0,0 +1,23 @@
session "Isabelle_DOF-Ontologies" = "Isabelle_DOF" +
  options [document = pdf, document_output = "output", document_build = dof]
  directories
    "CC_v3_1_R5"
    "Conceptual"
    "small_math"
    "CENELEC_50128"
  theories
    "document_setup"
    "document_templates"
    "CC_v3_1_R5/CC_v3_1_R5"
    "CC_v3_1_R5/CC_terminology"
    "Conceptual/Conceptual"
    "small_math/small_math"
    "CENELEC_50128/CENELEC_50128"
    "CENELEC_50128/CENELEC_50128_Documentation"
  document_files
    "root.bib"
    "lstisadof-manual.sty"
    "preamble.tex"
    "figures/antiquotations-PIDE.png"
    "figures/srac-as-es-application.png"
    "figures/srac-definition.png"
@@ -0,0 +1,65 @@
%% Copyright (c) University of Exeter
%%               University of Paris-Saclay
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause

%% Warning: Do Not Edit!
%% =====================
%% This is the root file for the Isabelle/DOF using the scrartcl class.
%%
%% All customization and/or additional packages should be added to the file
%% preamble.tex.

\RequirePackage{ifvtex}
\documentclass[16x9,9pt]{beamer}
\PassOptionsToPackage{force}{DOF-scholarly_paper}
\title{No Title Given}
\usepackage{DOF-core}

\usepackage{textcomp}
\bibliographystyle{abbrvnat}
\RequirePackage{subcaption}

\providecommand{\institute}[1]{}%
\providecommand{\inst}[1]{}%
\providecommand{\orcidID}[1]{}%
\providecommand{\email}[1]{}%


\usepackage[numbers, sort&compress, sectionbib]{natbib}

\usepackage{hyperref}
\setcounter{tocdepth}{3}
\hypersetup{%
bookmarksdepth=3
,pdfpagelabels
,pageanchor=true
,bookmarksnumbered
,plainpages=false
} % more detailed digital TOC (aka bookmarks)
\sloppy
\allowdisplaybreaks[4]

\newenvironment{frontmatter}{}{}
\raggedbottom
\begin{document}
\begin{frame}
\maketitle
\end{frame}
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{{\bibliography{root}}}{}
\end{document}

%%% Local Variables:
%%% mode: latex
%%% TeX-master: t
%%% End:
@@ -0,0 +1,65 @@
%% Copyright (c) University of Exeter
%%               University of Paris-Saclay
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause

%% Warning: Do Not Edit!
%% =====================
%% This is the root file for the Isabelle/DOF using the scrartcl class.
%%
%% All customization and/or additional packages should be added to the file
%% preamble.tex.

\RequirePackage{ifvtex}
\documentclass[]{beamer}
\PassOptionsToPackage{force}{DOF-scholarly_paper}
\title{No Title Given}
\usepackage{beamerposter}
\usepackage{DOF-core}

\usepackage{textcomp}
\bibliographystyle{abbrvnat}
\RequirePackage{subcaption}

\providecommand{\institute}[1]{}%
\providecommand{\inst}[1]{}%
\providecommand{\orcidID}[1]{}%
\providecommand{\email}[1]{}%


\usepackage[numbers, sort&compress, sectionbib]{natbib}

\usepackage{hyperref}
\setcounter{tocdepth}{3}
\hypersetup{%
bookmarksdepth=3
,pdfpagelabels
,pageanchor=true
,bookmarksnumbered
,plainpages=false
} % more detailed digital TOC (aka bookmarks)
\sloppy
\allowdisplaybreaks[4]

\newenvironment{frontmatter}{}{}
\raggedbottom
\begin{document}
\begin{frame}[fragile]
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{{\bibliography{root}}}{}
\end{frame}
\end{document}

%%% Local Variables:
%%% mode: latex
%%% TeX-master: t
%%% End:
@@ -23,6 +23,7 @@
%% preamble.tex.

\documentclass[submission,copyright,creativecommons]{eptcs}
\title{No Title Given}

\usepackage{DOF-core}
\bibliographystyle{eptcs}% the mandatory bibstyle
@@ -66,7 +67,7 @@

\begin{document}
\maketitle
\input{session}
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{%
\bibliography{root}
@@ -19,14 +19,13 @@
%% you need to download lipics.cls from
%% https://www.dagstuhl.de/en/publications/lipics/instructions-for-authors/
%% and add it manually to the praemble.tex and the ROOT file.
%% Moreover, the option "document_comment_latex=true" needs to be set
%% in the ROOT file.
%%
%% All customization and/or additional packages should be added to the file
%% preamble.tex.

\documentclass[a4paper,UKenglish,cleveref, autoref,thm-restate]{lipics-v2021}
\bibliographystyle{plainurl}% the mandatory bibstyle
\title{No Title Given}
\usepackage[numbers, sort&compress, sectionbib]{natbib}

\usepackage{DOF-core}
@@ -64,7 +63,7 @@

\begin{document}
\maketitle
\input{session}
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{%
\small
@@ -0,0 +1,67 @@
%% Copyright (c) 2019-2022 University of Exeter
%%               2018-2022 University of Paris-Saclay
%%               2018-2019 The University of Sheffield
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause

%% Warning: Do Not Edit!
%% =====================
%% This is the root file for the Isabelle/DOF using the scrartcl class.
%%
%% All customization and/or additional packages should be added to the file
%% preamble.tex.

\documentclass[iicol]{sn-jnl}
\PassOptionsToPackage{force}{DOF-scholarly_paper}
\title{No Title Given}
\usepackage{DOF-core}
\bibliographystyle{sn-basic}
\let\proof\relax
\let\endproof\relax
\newcommand{\institute}[1]{}
\usepackage{manyfoot}
\usepackage{DOF-core}
\setcounter{tocdepth}{3}
\hypersetup{%
bookmarksdepth=3
,pdfpagelabels
,pageanchor=true
,bookmarksnumbered
,plainpages=false
} % more detailed digital TOC (aka bookmarks)
\sloppy
\allowdisplaybreaks[4]

\usepackage{subcaption}
\usepackage[size=footnotesize]{caption}

\let\DOFauthor\relax
\begin{document}
\selectlanguage{USenglish}%
\renewcommand{\bibname}{References}%
\renewcommand{\figurename}{Fig.}
\renewcommand{\abstractname}{Abstract.}
\renewcommand{\subsubsectionautorefname}{Sect.}
\renewcommand{\subsectionautorefname}{Sect.}
\renewcommand{\sectionautorefname}{Sect.}
\renewcommand{\figureautorefname}{Fig.}
\newcommand{\lstnumberautorefname}{Line}

\maketitle
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{{\bibliography{root}}}{}
\end{document}

%%% Local Variables:
%%% mode: latex
%%% TeX-master: t
%%% End:
@@ -23,6 +23,7 @@
\RequirePackage{fix-cm}
\documentclass[]{svjour3}

\title{No Title Given}
\usepackage{DOF-core}
\usepackage{mathptmx}
\bibliographystyle{abbrvnat}
@@ -40,8 +41,6 @@
} % more detailed digital TOC (aka bookmarks)
\sloppy
\allowdisplaybreaks[4]
\usepackage[caption]{subfig}
\usepackage[size=footnotesize]{caption}

\begin{document}
\selectlanguage{USenglish}%
@@ -52,12 +51,10 @@
\renewcommand{\subsectionautorefname}{Sect.}
\renewcommand{\sectionautorefname}{Sect.}
\renewcommand{\figureautorefname}{Fig.}
\newcommand{\subtableautorefname}{\tableautorefname}
\newcommand{\subfigureautorefname}{\figureautorefname}
\newcommand{\lstnumberautorefname}{Line}

\maketitle
\input{session}
\IfFileExists{dof_session.tex}{\input{dof_session}}{\input{session}}
% optional bibliography
\IfFileExists{root.bib}{{\bibliography{root}}}{}
\end{document}
Three PNG figures added under figures/ (96 KiB, 67 KiB, and 50 KiB).
@@ -0,0 +1,327 @@
%% Copyright (C) 2018 The University of Sheffield
%%               2018-2021 The University of Paris-Saclay
%%               2019-2021 The University of Exeter
%%
%% License:
%% This program can be redistributed and/or modified under the terms
%% of the LaTeX Project Public License Distributed from CTAN
%% archives in directory macros/latex/base/lppl.txt; either
%% version 1.3c of the License, or (at your option) any later version.
%% OR
%% The 2-clause BSD-style license.
%%
%% SPDX-License-Identifier: LPPL-1.3c+ OR BSD-2-Clause
\usepackage{listings}
\usepackage{listingsutf8}
\usepackage{tikz}
\usepackage[many]{tcolorbox}
\tcbuselibrary{listings}
\tcbuselibrary{skins}
\usepackage{xstring}

\definecolor{OliveGreen}    {cmyk}{0.64,0,0.95,0.40}
\definecolor{BrickRed}      {cmyk}{0,0.89,0.94,0.28}
\definecolor{Blue}          {cmyk}{1,1,0,0}
\definecolor{CornflowerBlue}{cmyk}{0.65,0.13,0,0}




%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <antiquotations>
%% Hack: re-defining tag types for supporting highlighting of antiquotations
\gdef\lst@tagtypes{s}
\gdef\lst@TagKey#1#2{%
  \lst@Delim\lst@tagstyle #2\relax
    {Tag}\lst@tagtypes #1%
    {\lst@BeginTag\lst@EndTag}%
    \@@end\@empty{}}
\lst@Key{tag}\relax{\lst@TagKey\@empty{#1}}
\lst@Key{tagstyle}{}{\def\lst@tagstyle{#1}}
\lst@AddToHook{EmptyStyle}{\let\lst@tagstyle\@empty}
\gdef\lst@BeginTag{%
  \lst@DelimOpen
  \lst@ifextags\else
  {\let\lst@ifkeywords\iftrue
   \lst@ifmarkfirstintag \lst@firstintagtrue \fi}}
\lst@AddToHookExe{ExcludeDelims}{\let\lst@ifextags\iffalse}
\gdef\lst@EndTag{\lst@DelimClose\lst@ifextags\else}
\lst@Key{usekeywordsintag}t[t]{\lstKV@SetIf{#1}\lst@ifusekeysintag}
\lst@Key{markfirstintag}f[t]{\lstKV@SetIf{#1}\lst@ifmarkfirstintag}
\gdef\lst@firstintagtrue{\global\let\lst@iffirstintag\iftrue}
\global\let\lst@iffirstintag\iffalse
\lst@AddToHook{PostOutput}{\lst@tagresetfirst}
\lst@AddToHook{Output}
  {\gdef\lst@tagresetfirst{\global\let\lst@iffirstintag\iffalse}}
\lst@AddToHook{OutputOther}{\gdef\lst@tagresetfirst{}}
\lst@AddToHook{Output}
  {\ifnum\lst@mode=\lst@tagmode
     \lst@iffirstintag \let\lst@thestyle\lst@gkeywords@sty \fi
     \lst@ifusekeysintag\else \let\lst@thestyle\lst@gkeywords@sty\fi
   \fi}
\lst@NewMode\lst@tagmode
\gdef\lst@Tag@s#1#2\@empty#3#4#5{%
  \lst@CArg #1\relax\lst@DefDelimB {}{}%
    {\ifnum\lst@mode=\lst@tagmode \expandafter\@gobblethree \fi}%
    #3\lst@tagmode{#5}%
  \lst@CArg #2\relax\lst@DefDelimE {}{}{}#4\lst@tagmode}%
\gdef\lst@BeginCDATA#1\@empty{%
  \lst@TrackNewLines \lst@PrintToken
  \lst@EnterMode\lst@GPmode{}\let\lst@ifmode\iffalse
  \lst@mode\lst@tagmode #1\lst@mode\lst@GPmode\relax\lst@modetrue}
%
\def\beginlstdelim#1#2#3%
{%
  \def\endlstdelim{\texttt{\textbf{\color{black!60}#2}}\egroup}%
  \ttfamily\textbf{\color{black!60}#1}\bgroup\rmfamily\color{#3}\aftergroup\endlstdelim%
}
%% </antiquotations>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <isar>
\providecolor{isar}{named}{blue}
\renewcommand{\isacommand}[1]{\textcolor{OliveGreen!60}{\ttfamily\bfseries #1}}
\newcommand{\inlineisarbox}[1]{#1}
\NewTColorBox[]{isarbox}{}{
   ,boxrule=0pt
   ,boxsep=0pt
   ,colback=white!90!isar
   ,enhanced jigsaw
   ,borderline west={2pt}{0pt}{isar!60!black}
   ,sharp corners
   %,before skip balanced=0.5\baselineskip plus 2pt % works only with Tex Live 2020 and later
   ,enlarge top by=0mm
   ,enhanced
   ,overlay={\node[draw,fill=isar!60!black,xshift=0pt,anchor=north
                  east,font=\bfseries\footnotesize\color{white}]
            at (frame.north east) {Isar};}
}
%% </isar>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <out>
\providecolor{out}{named}{green}
\newtcblisting{out}[1][]{%
   listing only%
   ,boxrule=0pt
   ,boxsep=0pt
   ,colback=white!90!out
   ,enhanced jigsaw
   ,borderline west={2pt}{0pt}{out!60!black}
   ,sharp corners
   % ,before skip=10pt
   % ,after skip=10pt
   ,enlarge top by=0mm
   ,enhanced
   ,overlay={\node[draw,fill=out!60!black,xshift=0pt,anchor=north
                  east,font=\bfseries\footnotesize\color{white}]
            at (frame.north east) {Document};}
   ,listing options={
      breakatwhitespace=true
      ,columns=flexible%
      ,basicstyle=\small\rmfamily
      ,mathescape
      ,#1
   }
}%
%% </out>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <sml>
\lstloadlanguages{ML}
\providecolor{sml}{named}{red}
\lstdefinestyle{sml}{
   ,escapechar=ë%
   ,basicstyle=\ttfamily%
   ,commentstyle=\itshape%
   ,keywordstyle=\bfseries\color{CornflowerBlue}%
   ,ndkeywordstyle=\color{green}%
   ,language=ML
   % ,literate={%
   %     {<@>}{@}1%
   % }
   ,keywordstyle=[6]{\itshape}%
   ,morekeywords=[6]{args_type}%
   ,tag=**[s]{@\{}{\}}%
   ,tagstyle=\color{CornflowerBlue}%
   ,markfirstintag=true%
}%
\def\inlinesml{\lstinline[style=sml,breaklines=true,breakatwhitespace=true]}
\newtcblisting{sml}[1][]{%
   listing only%
   ,boxrule=0pt
   ,boxsep=0pt
   ,colback=white!90!sml
   ,enhanced jigsaw
   ,borderline west={2pt}{0pt}{sml!60!black}
   ,sharp corners
   % ,before skip=10pt
   % ,after skip=10pt
   ,enlarge top by=0mm
   ,enhanced
   ,overlay={\node[draw,fill=sml!60!black,xshift=0pt,anchor=north
                  east,font=\bfseries\footnotesize\color{white}]
            at (frame.north east) {SML};}
   ,listing options={
      style=sml
      ,columns=flexible%
      ,basicstyle=\small\ttfamily
      ,#1
   }
}%
%% </sml>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <latex>
\lstloadlanguages{TeX}
\providecolor{ltx}{named}{yellow}
\lstdefinestyle{lltx}{language=[AlLaTeX]TeX,
   ,basicstyle=\ttfamily%
   ,showspaces=false%
   ,escapechar=ë
   ,showlines=false%
   ,morekeywords={newisadof}
   % ,keywordstyle=\bfseries%
   % Defining 2-keywords
   ,keywordstyle=[1]{\color{BrickRed!60}\bfseries}%
   % Defining 3-keywords
   ,keywordstyle=[2]{\color{OliveGreen!60}\bfseries}%
   % Defining 4-keywords
   ,keywordstyle=[3]{\color{black!60}\bfseries}%
   % Defining 5-keywords
   ,keywordstyle=[4]{\color{Blue!70}\bfseries}%
   % Defining 6-keywords
   ,keywordstyle=[5]{\itshape}%
   %
}
\lstdefinestyle{ltx}{style=lltx,
   basicstyle=\ttfamily\small}%
\def\inlineltx{\lstinline[style=ltx, breaklines=true,columns=fullflexible]}
% see
% https://tex.stackexchange.com/questions/247643/problem-with-tcblisting-first-listed-latex-command-is-missing
\NewTCBListing{ltx}{ !O{} }{%
   listing only%
   ,boxrule=0pt
   ,boxsep=0pt
   ,colback=white!90!ltx
   ,enhanced jigsaw
   ,borderline west={2pt}{0pt}{ltx!60!black}
   ,sharp corners
   % ,before skip=10pt
   % ,after skip=10pt
   ,enlarge top by=0mm
   ,enhanced
   ,overlay={\node[draw,fill=ltx!60!black,xshift=0pt,anchor=north
                  east,font=\bfseries\footnotesize\color{white}]
            at (frame.north east) {\LaTeX};}
   ,listing options={
      style=lltx,
      ,columns=flexible%
      ,basicstyle=\small\ttfamily
      ,#1
   }
}%
%% </latex>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <bash>
\providecolor{bash}{named}{black}
\lstloadlanguages{bash}
\lstdefinestyle{bash}{%
   language=bash
   ,escapechar=ë
   ,basicstyle=\ttfamily%
   ,showspaces=false%
   ,showlines=false%
   ,columns=flexible%
   % ,keywordstyle=\bfseries%
   % Defining 2-keywords
   ,keywordstyle=[1]{\color{BrickRed!60}\bfseries}%
   % Defining 3-keywords
   ,keywordstyle=[2]{\color{OliveGreen!60}\bfseries}%
   % Defining 4-keywords
   ,keywordstyle=[3]{\color{black!60}\bfseries}%
   % Defining 5-keywords
   ,keywordstyle=[4]{\color{Blue!80}\bfseries}%
   ,alsoletter={*,-,:,~,/}
   ,morekeywords=[4]{}%
   % Defining 6-keywords
   ,keywordstyle=[5]{\itshape}%
   %
}
\def\inlinebash{\lstinline[style=bash, breaklines=true,columns=fullflexible]}
\newcommand\@isabsolutepath[3]{%
  \StrLeft{#1}{1}[\firstchar]%
  \IfStrEq{\firstchar}{/}{#2}{#3}%
}

\newcommand{\@homeprefix}[1]{%
  \ifthenelse{\equal{#1}{}}{\textasciitilde}{\textasciitilde/}%
}

\newcommand{\prompt}[1]{%
  \color{Blue!80}\textbf{\texttt{%
  achim@logicalhacking:{\@isabsolutepath{#1}{#1}{\@homeprefix{#1}#1}}\$}}%
}
\newtcblisting{bash}[1][]{%
   listing only%
   ,boxrule=0pt
   ,boxsep=0pt
   ,colback=white!90!bash
   ,enhanced jigsaw
   ,borderline west={2pt}{0pt}{bash!60!black}
   ,sharp corners
   % ,before skip=10pt
   % ,after skip=10pt
   ,enlarge top by=0mm
   ,enhanced
   ,overlay={\node[draw,fill=bash!60!black,xshift=0pt,anchor=north
                  east,font=\bfseries\footnotesize\color{white}]
            at (frame.north east) {Bash};}
   ,listing options={
      style=bash
      ,columns=flexible%
      ,breaklines=true%
      ,prebreak=\mbox{\space\textbackslash}%
      ,basicstyle=\small\ttfamily%
      ,#1
   }
}%
%% </bash>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% <config>
\providecolor{config}{named}{gray}
\newtcblisting{config}[2][]{%
   listing only%
   ,boxrule=0pt
   ,boxsep=0pt
   ,colback=white!90!config
   ,enhanced jigsaw
   ,borderline west={2pt}{0pt}{config!60!black}
   ,sharp corners
   % ,before skip=10pt
   % ,after skip=10pt
   ,enlarge top by=0mm
   ,enhanced
   ,overlay={\node[draw,fill=config!60!black,xshift=0pt,anchor=north
                  east,font=\bfseries\footnotesize\color{white}]
            at (frame.north east) {#2};}
   ,listing options={
      breakatwhitespace=true
      ,columns=flexible%
      ,basicstyle=\small\ttfamily
      ,mathescape
      ,#1
   }
}%
%% </config>
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
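For orientation, a minimal usage sketch of the environments defined in `lstisadof-manual.sty` above (not part of this diff; the option values and the document class are illustrative only, and the file is assumed to be on the TeX search path):

```latex
% Hypothetical usage of the tcolorbox/listings environments from
% lstisadof-manual.sty; options shown are illustrative assumptions.
\documentclass{scrartcl}
\usepackage{lstisadof-manual}
\begin{document}
\begin{sml}           % SML listing in a tinted box labelled "SML"
fun add x y = x + y
\end{sml}
\begin{bash}          % shell listing; ë...ë escapes to LaTeX (see escapechar)
ë\prompt{}ë isabelle build -D .
\end{bash}
\begin{config}{ROOT}  % generic box; the mandatory argument sets the label
session "MySession" = "Isabelle_DOF" +
\end{config}
\end{document}
```

The `\prompt{...}` call inside the `bash` listing relies on the `escapechar=ë` setting of the `bash` style defined above.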
@@ -0,0 +1,4 @@
\usepackage{dirtree}
\renewcommand*\DTstylecomment{\ttfamily\itshape}

\usepackage{lstisadof-manual}
@@ -451,7 +451,7 @@
  journal =   {Archive of Formal Proofs},
  month =     may,
  year =      2010,
  note =      {\url{http://isa-afp.org/entries/Regular-Sets.html}, Formal
  note =      {\url{https://isa-afp.org/entries/Regular-Sets.html}, Formal
               proof development},
  issn =      {2150-914x}
}
@@ -462,7 +462,7 @@
  journal =   {Archive of Formal Proofs},
  month =     mar,
  year =      2004,
  note =      {\url{http://isa-afp.org/entries/Functional-Automata.html},
  note =      {\url{https://isa-afp.org/entries/Functional-Automata.html},
               Formal proof development},
  issn =      {2150-914x}
}
@@ -0,0 +1,20 @@
(*<*)
theory "document_setup"
  imports
    "Isabelle_DOF.technical_report"
    "Isabelle_DOF-Ontologies.CENELEC_50128"
    "Isabelle_DOF-Ontologies.CC_terminology"
begin

use_template "scrreprt-modern"
use_ontology "Isabelle_DOF.technical_report" and "Isabelle_DOF-Ontologies.CENELEC_50128"
  and "Isabelle_DOF-Ontologies.CC_terminology"

(*>*)

title*[title::title] \<open>Isabelle/DOF\<close>
subtitle*[subtitle::subtitle]\<open>Ontologies\<close>

(*<*)
end
(*>*)
@@ -0,0 +1,32 @@
(*************************************************************************
 * Copyright (C)
 *               2019      The University of Exeter
 *               2018-2019 The University of Paris-Saclay
 *               2018      The University of Sheffield
 *
 * License:
 *   This program can be redistributed and/or modified under the terms
 *   of the 2-clause BSD-style license.
 *
 *   SPDX-License-Identifier: BSD-2-Clause
 *************************************************************************)

theory
  "document_templates"
imports
  "Isabelle_DOF.Isa_DOF"
begin

define_template "./document-templates/root-eptcs-UNSUPPORTED.tex"
  "Unsupported template for the EPTCS class. Not for general use."
define_template "./document-templates/root-lipics-v2021-UNSUPPORTED.tex"
  "Unsupported template for LIPICS (v2021). Not for general use."
define_template "./document-templates/root-svjour3-UNSUPPORTED.tex"
  "Unsupported template for SVJOUR. Not for general use."
define_template "./document-templates/root-sn-article-UNSUPPORTED.tex"
  "Unsupported template for Springer Nature's template. Not for general use."
define_template "./document-templates/root-beamer-UNSUPPORTED.tex"
  "Unsupported template for presentations. Not for general use."
define_template "./document-templates/root-beamerposter-UNSUPPORTED.tex"
  "Unsupported template for poster. Not for general use."
end
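For context, a hedged sketch (not part of this diff) of how one of the templates registered above would be activated from a session's setup theory, mirroring the `use_template "scrreprt-modern"` call that appears in `document_setup`; the template name `root-beamer-UNSUPPORTED` is an assumption inferred from the file name and may differ from the actually registered name:

```
(* Hypothetical setup theory selecting one of the unsupported templates;
   the template name is inferred from the file name above. *)
theory my_slides_setup
  imports "Isabelle_DOF-Ontologies.document_templates"
begin

use_template "root-beamer-UNSUPPORTED"

end
```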
@@ -11,10 +11,11 @@
 * SPDX-License-Identifier: BSD-2-Clause
 *************************************************************************)

section\<open>An example ontology for a math paper\<close>
chapter\<open>An example ontology for a math paper\<close>

theory small_math
  imports "Isabelle_DOF.Isa_COL"
  imports
    "Isabelle_DOF.Isa_COL"
begin

doc_class title =
@@ -64,18 +65,25 @@ doc_class result = technical +



ML\<open>fun check_invariant_invariant oid {is_monitor:bool} ctxt =
  let val kind_term = AttributeAccess.compute_attr_access ctxt "kind" oid @{here} @{here}
      val property_termS = AttributeAccess.compute_attr_access ctxt "property" oid @{here} @{here}
ML\<open>
fn thy =>
let fun check_invariant_invariant oid {is_monitor:bool} ctxt =
  let val kind_term = ISA_core.compute_attr_access ctxt "kind" oid NONE @{here}
      val property_termS = ISA_core.compute_attr_access ctxt "property" oid NONE @{here}
      val tS = HOLogic.dest_list property_termS
  in case kind_term of
       @{term "proof"} => if not(null tS) then true
                          else error("class class invariant violation")
     | _ => false
  end
    val cid = "result"
    val cid_long = DOF_core.get_onto_class_name_global cid thy
    val binding = DOF_core.binding_from_onto_class_pos cid thy
in DOF_core.add_ml_invariant binding
     (DOF_core.make_ml_invariant (check_invariant_invariant, cid_long)) thy end
\<close>

setup\<open>DOF_core.update_class_invariant "small_math.result" check_invariant_invariant\<close>
(*setup\<open>DOF_core.add_ml_invariant "small_math.result" check_invariant_invariant\<close>*)


doc_class example = technical +
@@ -85,7 +93,7 @@ doc_class "conclusion" = text_section +
   establish :: "(contribution_claim \<times> result) set"

text\<open> Besides subtyping, there is another relation between
doc_classes: a class can be a \<^emph>\<open>monitor\<close> to other ones,
doc\_classes: a class can be a \<^emph>\<open>monitor\<close> to other ones,
which is expressed by occurrence in the where clause.
While sub-classing refers to data-inheritance of attributes,
a monitor captures structural constraints -- the order --
@@ -137,10 +145,10 @@ fun dest_option _ (Const (@{const_name "None"}, _)) = NONE

in

fun check ctxt cidS mon_id pos =
  let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos @{here}
fun check ctxt cidS mon_id pos_opt =
  let val trace = AttributeAccess.compute_trace_ML ctxt mon_id pos_opt @{here}
      val groups = partition (Context.proof_of ctxt) cidS trace
      fun get_level_raw oid = AttributeAccess.compute_attr_access ctxt "level" oid @{here} @{here};
      fun get_level_raw oid = ISA_core.compute_attr_access ctxt "level" oid NONE @{here};
      fun get_level oid = dest_option (snd o HOLogic.dest_number) (get_level_raw (oid));
      fun check_level_hd a = case (get_level (snd a)) of
                               NONE => error("Invariant violation: leading section" ^ snd a ^
@@ -164,10 +172,17 @@ end
end
\<close>

setup\<open> let val cidS = ["small_math.introduction","small_math.technical", "small_math.conclusion"];
           fun body moni_oid _ ctxt = (Small_Math_trace_invariant.check ctxt cidS moni_oid @{here};
setup\<open>
fn thy =>
let val cidS = ["small_math.introduction","small_math.technical", "small_math.conclusion"];
    fun body moni_oid _ ctxt = (Small_Math_trace_invariant.check ctxt cidS moni_oid NONE;
                                true)
       in DOF_core.update_class_invariant "small_math.article" body end\<close>
    val ctxt = Proof_Context.init_global thy
    val cid = "article"
    val cid_long = DOF_core.get_onto_class_name_global cid thy
    val binding = DOF_core.binding_from_onto_class_pos cid thy
in DOF_core.add_ml_invariant binding (DOF_core.make_ml_invariant (body, cid_long)) thy end
\<close>

end
@@ -0,0 +1,12 @@
theory Fun_Function
  imports "Isabelle_DOF-Proofs.Very_Deep_DOF"
  keywords "function*" "termination*" :: thy_goal_defn and
    "fun*" :: thy_defn
begin

ML_file "specification.ML"
ML_file "function.ML"
ML_file "fun.ML"


end
@@ -0,0 +1,10 @@
session "Isabelle_DOF-Proofs" (proofs) = "HOL-Proofs" +
  options [document = false, record_proofs = 2, parallel_limit = 500, document_build = dof]
  sessions
    "Isabelle_DOF"
    Metalogic_ProofChecker
  theories
    Isabelle_DOF.ontologies
    Isabelle_DOF.Isa_DOF
    Very_Deep_DOF
    Reification_Test
@@ -0,0 +1,19 @@
theory Very_Deep_DOF
  imports "Isabelle_DOF-Proofs.Very_Deep_Interpretation"

begin

(* tests *)
term "@{typ ''int => int''}"
term "@{term ''Bound 0''}"
term "@{thm ''refl''}"
term "@{docitem ''<doc_ref>''}"
ML\<open> @{term "@{docitem ''<doc_ref>''}"}\<close>

term "@{typ \<open>int \<Rightarrow> int\<close>}"
term "@{term \<open>\<forall>x. P x \<longrightarrow> Q\<close>}"
term "@{thm \<open>refl\<close>}"
term "@{docitem \<open>doc_ref\<close>}"
ML\<open> @{term "@{docitem \<open>doc_ref\<close>}"}\<close>

end
@ -0,0 +1,251 @@
theory Very_Deep_Interpretation
  imports "Isabelle_DOF.Isa_COL"
          Metalogic_ProofChecker.ProofTerm

begin

subsection\<open> Syntax \<close>

\<comment> \<open>and others in the future : file, http, thy, ...\<close>

(* Delete the shallow-interpretation notations (mixfixes) of the term antiquotations,
   so we can reuse them for the deep interpretation *)
no_notation "Isabelle_DOF_typ" ("@{typ _}")
no_notation "Isabelle_DOF_term" ("@{term _}")
no_notation "Isabelle_DOF_thm" ("@{thm _}")
no_notation "Isabelle_DOF_file" ("@{file _}")
no_notation "Isabelle_DOF_thy" ("@{thy _}")
no_notation "Isabelle_DOF_docitem" ("@{docitem _}")
no_notation "Isabelle_DOF_docitem_attr" ("@{docitemattr (_) :: (_)}")
no_notation "Isabelle_DOF_trace_attribute" ("@{trace'_-attribute _}")

consts Isabelle_DOF_typ :: "string \<Rightarrow> typ" ("@{typ _}")
consts Isabelle_DOF_term :: "string \<Rightarrow> term" ("@{term _}")
datatype "thm" = Isabelle_DOF_thm string ("@{thm _}") | Thm_content ("proof":proofterm)
datatype "thms_of" = Isabelle_DOF_thms_of string ("@{thms-of _}")
datatype "file" = Isabelle_DOF_file string ("@{file _}")
datatype "thy" = Isabelle_DOF_thy string ("@{thy _}")
consts Isabelle_DOF_docitem :: "string \<Rightarrow> 'a" ("@{docitem _}")
datatype "docitem_attr" = Isabelle_DOF_docitem_attr string string ("@{docitemattr (_) :: (_)}")
consts Isabelle_DOF_trace_attribute :: "string \<Rightarrow> (string * string) list" ("@{trace'_-attribute _}")

subsection\<open> Semantics \<close>

ML\<open>
structure Meta_ISA_core =
struct

fun ML_isa_check_trace_attribute thy (term, _, pos) s =
  let
    val oid = (HOLogic.dest_string term
               handle TERM(_,[t]) => error ("wrong term format: must be string constant: "
                                            ^ Syntax.string_of_term_global thy t))
    val _ = DOF_core.get_instance_global oid thy
  in SOME term end

fun reify_typ (Type (s, typ_list)) =
      \<^Const>\<open>Ty\<close> $ HOLogic.mk_literal s $ HOLogic.mk_list \<^Type>\<open>typ\<close> (map reify_typ typ_list)
  | reify_typ (TFree (name, sort)) =
      \<^Const>\<open>Tv\<close> $ (\<^Const>\<open>Free\<close> $ HOLogic.mk_literal name)
        $ (HOLogic.mk_set \<^typ>\<open>class\<close> (map HOLogic.mk_literal sort))
  | reify_typ (TVar (indexname, sort)) =
      let val (name, index_value) = indexname
      in \<^Const>\<open>Tv\<close>
           $ (\<^Const>\<open>Var\<close>
              $ HOLogic.mk_prod (HOLogic.mk_literal name, HOLogic.mk_number \<^Type>\<open>int\<close> index_value))
           $ (HOLogic.mk_set \<^typ>\<open>class\<close> (map HOLogic.mk_literal sort)) end

fun ML_isa_elaborate_typ (thy:theory) _ _ term_option _ =
  case term_option of
    NONE => error("Wrong term option. You must use a defined term")
  | SOME term => let
      val typ_name = HOLogic.dest_string term
      val typ = Syntax.read_typ_global thy typ_name
    in reify_typ typ end

fun reify_term (Const (name, typ)) = \<^Const>\<open>Ct\<close> $ HOLogic.mk_literal name $ reify_typ typ
  | reify_term (Free (name, typ)) =
      \<^Const>\<open>Fv\<close> $ (\<^Const>\<open>Free\<close> $ HOLogic.mk_literal name) $ reify_typ typ
  | reify_term (Var (indexname, typ)) =
      let val (name, index_value) = indexname
      in \<^Const>\<open>Fv\<close>
           $ (\<^Const>\<open>Var\<close>
              $ HOLogic.mk_prod (HOLogic.mk_literal name, HOLogic.mk_number \<^Type>\<open>int\<close> index_value))
           $ reify_typ typ end
  | reify_term (Bound i) = \<^Const>\<open>Bv\<close> $ HOLogic.mk_nat i
  | reify_term (Abs (_, typ, term)) = \<^Const>\<open>Abs\<close> $ reify_typ typ $ reify_term term
  | reify_term (Term.$ (t1, t2)) = \<^Const>\<open>App\<close> $ reify_term t1 $ reify_term t2

fun ML_isa_elaborate_term (thy:theory) _ _ term_option pos =
  case term_option of
    NONE => error("Wrong term option. You must use a defined term")
  | SOME term => let
      val term_name = HOLogic.dest_string term
      val term = Syntax.read_term_global thy term_name
      val term' = DOF_core.transduce_term_global {mk_elaboration=true} (term, pos) thy
      val value = Value_Command.value (Proof_Context.init_global thy) term'
    in reify_term value end

(*fun ML_isa_elaborate_term' (thy:theory) _ _ term_option pos =
  case term_option of
    NONE => error("Wrong term option. You must use a defined term")
  | SOME term => let
      val term_name = HOLogic.dest_string term
      val term = Syntax.read_term_global thy term_name
      val term' = ML_isa_elaborate_term (thy:theory) _ pos term_option _
    in reify_term term end*)

fun reify_proofterm (PBound i) = \<^Const>\<open>PBound\<close> $ (HOLogic.mk_nat i)
  | reify_proofterm (Abst (_, typ_option, proof)) =
      \<^Const>\<open>Abst\<close> $ reify_typ (the typ_option) $ reify_proofterm proof
  | reify_proofterm (AbsP (_, term_option, proof)) =
      \<^Const>\<open>AbsP\<close> $ reify_term (the term_option) $ reify_proofterm proof
  | reify_proofterm (op % (proof, term_option)) =
      \<^Const>\<open>Appt\<close> $ reify_proofterm proof $ reify_term (the term_option)
  | reify_proofterm (op %% (proof1, proof2)) =
      \<^Const>\<open>AppP\<close> $ reify_proofterm proof1 $ reify_proofterm proof2
  | reify_proofterm (Hyp term) = \<^Const>\<open>Hyp\<close> $ (reify_term term)
  | reify_proofterm (PAxm (_, term, typ_list_option)) =
      let
        val tvars = rev (Term.add_tvars term [])
        val meta_tvars = map (fn ((name, index_value), sort) =>
              HOLogic.mk_prod
                (\<^Const>\<open>Var\<close>
                 $ HOLogic.mk_prod
                     (HOLogic.mk_literal name, HOLogic.mk_number \<^Type>\<open>int\<close> index_value)
                , HOLogic.mk_set \<^typ>\<open>class\<close> (map HOLogic.mk_literal sort))) tvars
        val typ_list = case typ_list_option of
                         NONE => meta_tvars |> map (K dummyT)
                       | SOME T => T
        val meta_typ_list =
          HOLogic.mk_list @{typ "tyinst"} (map2 (fn x => fn y => HOLogic.mk_prod (x, y))
                                                meta_tvars (map reify_typ typ_list))
      in \<^Const>\<open>PAxm\<close> $ reify_term term $ meta_typ_list end
  | reify_proofterm (PClass (typ, class)) =
      \<^Const>\<open>OfClass\<close> $ reify_typ typ $ HOLogic.mk_literal class
  | reify_proofterm (PThm ({prop = prop, types = types, ...}, _)) =
      let
        val tvars = rev (Term.add_tvars prop [])
        val meta_tvars = map (fn ((name, index_value), sort) =>
              HOLogic.mk_prod
                (\<^Const>\<open>Var\<close>
                 $ HOLogic.mk_prod
                     (HOLogic.mk_literal name, HOLogic.mk_number \<^Type>\<open>int\<close> index_value)
                , HOLogic.mk_set \<^typ>\<open>class\<close> (map HOLogic.mk_literal sort))) tvars
        val meta_typ_list =
          HOLogic.mk_list \<^typ>\<open>tyinst\<close> (map2 (fn x => fn y => HOLogic.mk_prod (x, y))
                                              meta_tvars (map reify_typ (the types)))
      in \<^Const>\<open>PAxm\<close> $ reify_term prop $ meta_typ_list end

fun ML_isa_elaborate_thm (thy:theory) _ _ term_option pos =
  case term_option of
    NONE => ISA_core.err ("Malformed term annotation") pos
  | SOME term =>
      let
        val thm_name = HOLogic.dest_string term
        (*val _ = writeln ("In ML_isa_elaborate_thm thm_name: " ^ \<^make_string> thm_name)*)
        val thm = Proof_Context.get_thm (Proof_Context.init_global thy) thm_name
        (*val _ = writeln ("In ML_isa_elaborate_thm thm: " ^ \<^make_string> thm)*)
        val body = Proofterm.strip_thm_body (Thm.proof_body_of thm);
        val prf = Proofterm.proof_of body;
        (* Proof_Syntax.standard_proof_of reconstructs the proof and seems to rewrite
           the option arguments (with a value NONE) of the proof datatype constructors,
           at least for PAxm, with "SOME (typ/term)",
           allowing us to use the projection function "the".
           Maybe the function can deal with
           all the option types of the proof datatype constructors. *)
        val proof = Proof_Syntax.standard_proof_of
                      {full = true, expand_name = Thm.expand_name thm} thm
        (*val _ = writeln ("In ML_isa_elaborate_thm proof: " ^ \<^make_string> proof)*)
        (* After a brief discussion with Simon Roßkopf, it seems preferable to use
           Thm.reconstruct_proof_of instead of Proof_Syntax.standard_proof_of,
           whose behaviour is less well understood.
           Thm.reconstruct_proof_of seems sufficient to obtain a reifiable PAxm
           in the metalogic. *)
        val proof' = Thm.reconstruct_proof_of thm
        (*in \<^Const>\<open>Thm_content\<close> $ reify_proofterm prf end*)
        (*in \<^Const>\<open>Thm_content\<close> $ reify_proofterm proof end*)
      in \<^Const>\<open>Thm_content\<close> $ reify_proofterm proof end

fun ML_isa_elaborate_thms_of (thy:theory) _ _ term_option pos =
  case term_option of
    NONE => ISA_core.err ("Malformed term annotation") pos
  | SOME term =>
      let
        val thm_name = HOLogic.dest_string term
        val thm = Proof_Context.get_thm (Proof_Context.init_global thy) thm_name
        val body = Proofterm.strip_thm_body (Thm.proof_body_of thm)
        val all_thms_name = Proofterm.fold_body_thms (fn {name, ...} => insert (op =) name) [body] []
        (*val all_thms = map (Proof_Context.get_thm (Proof_Context.init_global thy)) all_thms_name*)
        (*val all_proofs = map (Proof_Syntax.standard_proof_of
                                {full = true, expand_name = Thm.expand_name thm}) all_thms*)
        (*in HOLogic.mk_list \<^Type>\<open>thm\<close> (map (fn proof => \<^Const>\<open>Thm_content\<close> $ reify_proofterm proof) all_proofs) end*)
      in HOLogic.mk_list \<^typ>\<open>string\<close> (map HOLogic.mk_string all_thms_name) end

fun ML_isa_elaborate_trace_attribute (thy:theory) _ _ term_option pos =
  case term_option of
    NONE => ISA_core.err ("Malformed term annotation") pos
  | SOME term =>
      let
        val oid = HOLogic.dest_string term
        val traces = ISA_core.compute_attr_access (Context.Theory thy) "trace" oid NONE pos
        fun conv (\<^Const>\<open>Pair \<^typ>\<open>doc_class rexp\<close> \<^typ>\<open>string\<close>\<close>
                  $ (\<^Const>\<open>Atom \<^typ>\<open>doc_class\<close>\<close> $ (\<^Const>\<open>mk\<close> $ s)) $ S) =
          let val s' = DOF_core.get_onto_class_name_global (HOLogic.dest_string s) thy
          in \<^Const>\<open>Pair \<^typ>\<open>string\<close> \<^typ>\<open>string\<close>\<close> $ HOLogic.mk_string s' $ S end
        val traces' = map conv (HOLogic.dest_list traces)
      in HOLogic.mk_list \<^Type>\<open>prod \<^typ>\<open>string\<close> \<^typ>\<open>string\<close>\<close> traces' end

end; (* struct *)

\<close>
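The reification above maps Isabelle's ML-level `typ` values onto the deep-embedded `typ` datatype of the metalogic (constructors `Ty` for type constructors, `Tv` for type variables with a sort). The translation idea itself is language-neutral; here is a minimal toy sketch in Python, where the tuple encoding of `Type`/`TFree` and the `Ty`/`Tv` dataclasses are purely illustrative stand-ins for the ML and HOL datatypes, not the actual metalogic constructors:

```python
from dataclasses import dataclass

# Hypothetical mirrors of the metalogic's deep-embedded typ datatype:
# Ty = type constructor applied to arguments, Tv = type variable with a sort.
@dataclass(frozen=True)
class Ty:
    name: str
    args: tuple

@dataclass(frozen=True)
class Tv:
    name: str
    sort: frozenset

def reify_typ(t):
    """Translate a ("Type", name, args) / ("TFree", name, sort) tree,
    standing in for ML's Type/TFree constructors, into the deep embedding."""
    kind = t[0]
    if kind == "Type":
        _, name, args = t
        return Ty(name, tuple(reify_typ(a) for a in args))
    if kind == "TFree":
        _, name, sort = t
        return Tv(name, frozenset(sort))
    raise ValueError(f"unknown constructor: {kind}")

# int => bool, i.e. Type ("fun", [Type ("int", []), Type ("bool", [])])
fun_ty = ("Type", "fun", [("Type", "int", []), ("Type", "bool", [])])
print(reify_typ(fun_ty))
```

As in `reify_typ` above, the translation is purely structural recursion over the type tree; sorts become sets so that their order is irrelevant, mirroring `HOLogic.mk_set`.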

ML\<open>
val ty1 = Meta_ISA_core.reify_typ @{typ "int"}
val ty2 = Meta_ISA_core.reify_typ @{typ "int \<Rightarrow> bool"}
val ty3 = Meta_ISA_core.reify_typ @{typ "prop"}
val ty4 = Meta_ISA_core.reify_typ @{typ "'a list"}
\<close>

ML\<open>
val t1 = Meta_ISA_core.reify_term @{term "1::int"}
val t2 = Meta_ISA_core.reify_term @{term "\<lambda>x. x = 1"}
val t3 = Meta_ISA_core.reify_term @{term "[2, 3::int]"}
\<close>
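The `t1`–`t3` checks exercise `reify_term` on closed terms. Its core cases (constants, de Bruijn-indexed bound variables, abstraction, application) can be sketched in the same toy Python style; the class and tuple shapes below are illustrative assumptions, not the metalogic's actual `Ct`/`Bv`/`Abs`/`App` constructors:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Ct:      # constant, identified by its name
    name: str

@dataclass(frozen=True)
class Bv:      # bound variable as a de Bruijn index
    index: int

@dataclass(frozen=True)
class Abs:     # abstraction over its body (binder name is irrelevant)
    body: object

@dataclass(frozen=True)
class App:     # application of a function term to an argument term
    fun: object
    arg: object

def reify_term(t):
    """Structural recursion over a tuple-encoded lambda term,
    mirroring the shape of reify_term in the theory above."""
    kind = t[0]
    if kind == "Const":
        return Ct(t[1])
    if kind == "Bound":
        return Bv(t[1])
    if kind == "Abs":
        return Abs(reify_term(t[1]))
    if kind == "$":
        return App(reify_term(t[1]), reify_term(t[2]))
    raise ValueError(f"unknown constructor: {kind}")

# \x. f x  ~~>  Abs (App (Ct f) (Bv 0))
term = ("Abs", ("$", ("Const", "f"), ("Bound", 0)))
print(reify_term(term))
```

Note that, as in the ML version, binder names are discarded: only the de Bruijn structure survives reification.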

subsection\<open> Isar - Setup\<close>

(* Isa_transformer declarations for the Isabelle_DOF term antiquotations (typ, term, thm, etc.).
   They must be declared in the same theory file as the declarations of the
   Isabelle_DOF term antiquotations themselves. *)
setup\<open>
[(\<^type_name>\<open>thm\<close>, ISA_core.ML_isa_check_thm, Meta_ISA_core.ML_isa_elaborate_thm)
, (\<^type_name>\<open>thms_of\<close>, ISA_core.ML_isa_check_thm, Meta_ISA_core.ML_isa_elaborate_thms_of)
, (\<^type_name>\<open>file\<close>, ISA_core.ML_isa_check_file, ISA_core.ML_isa_elaborate_generic)]
|> fold (fn (n, check, elaborate) => fn thy =>
          let val ns = Sign.tsig_of thy |> Type.type_space
              val name = n
              val {pos, ...} = Name_Space.the_entry ns name
              val bname = Long_Name.base_name name
              val binding = Binding.make (bname, pos)
                            |> Binding.prefix_name DOF_core.ISA_prefix
                            |> Binding.prefix false bname
          in DOF_core.add_isa_transformer binding ((check, elaborate) |> DOF_core.make_isa_transformer) thy
          end)
#>
([(\<^const_name>\<open>Isabelle_DOF_typ\<close>, ISA_core.ML_isa_check_typ, Meta_ISA_core.ML_isa_elaborate_typ)
 ,(\<^const_name>\<open>Isabelle_DOF_term\<close>, ISA_core.ML_isa_check_term, Meta_ISA_core.ML_isa_elaborate_term)
 ,(\<^const_name>\<open>Isabelle_DOF_docitem\<close>,
    ISA_core.ML_isa_check_docitem, ISA_core.ML_isa_elaborate_generic)
 ,(\<^const_name>\<open>Isabelle_DOF_trace_attribute\<close>,
    ISA_core.ML_isa_check_trace_attribute, ISA_core.ML_isa_elaborate_trace_attribute)]
 |> fold (fn (n, check, elaborate) => fn thy =>
           let val ns = Sign.consts_of thy |> Consts.space_of
               val name = n
               val {pos, ...} = Name_Space.the_entry ns name
               val bname = Long_Name.base_name name
               val binding = Binding.make (bname, pos)
           in DOF_core.add_isa_transformer binding ((check, elaborate) |> DOF_core.make_isa_transformer) thy
           end))
\<close>

end
@@ -0,0 +1,179 @@
(*  Title:      HOL/Tools/Function/fun.ML
    Author:     Alexander Krauss, TU Muenchen

Command "fun": Function definitions with pattern splitting/completion
and automated termination proofs.
*)

signature FUNCTION_FUN =
sig
  val fun_config : Function_Common.function_config
  val add_fun : (binding * typ option * mixfix) list ->
    Specification.multi_specs -> Function_Common.function_config ->
    local_theory -> Proof.context
  val add_fun_cmd : (binding * string option * mixfix) list ->
    Specification.multi_specs_cmd -> Function_Common.function_config ->
    bool -> local_theory -> Proof.context
end

structure Function_Fun : FUNCTION_FUN =
struct

open Function_Lib
open Function_Common


fun check_pats ctxt geq =
  let
    fun err str = error (cat_lines ["Malformed definition:",
          str ^ " not allowed in sequential mode.",
          Syntax.string_of_term ctxt geq])

    fun check_constr_pattern (Bound _) = ()
      | check_constr_pattern t =
          let
            val (hd, args) = strip_comb t
          in
            (case hd of
              Const (hd_s, hd_T) =>
                (case body_type hd_T of
                  Type (Tname, _) =>
                    (case Ctr_Sugar.ctr_sugar_of ctxt Tname of
                      SOME {ctrs, ...} => exists (fn Const (s, _) => s = hd_s) ctrs
                    | NONE => false)
                | _ => false)
            | _ => false) orelse err "Non-constructor pattern";
            map check_constr_pattern args;
            ()
          end

    val (_, qs, gs, args, _) = split_def ctxt (K true) geq

    val _ = if not (null gs) then err "Conditional equations" else ()
    val _ = map check_constr_pattern args

    (* just count occurrences to check linearity *)
    val _ = if fold (fold_aterms (fn Bound _ => Integer.add 1 | _ => I)) args 0 > length qs
      then err "Nonlinear patterns" else ()
  in
    ()
  end

fun mk_catchall fixes arity_of =
  let
    fun mk_eqn ((fname, fT), _) =
      let
        val n = arity_of fname
        val (argTs, rT) = chop n (binder_types fT)
          |> apsnd (fn Ts => Ts ---> body_type fT)

        val qs = map Free (Name.invent Name.context "a" n ~~ argTs)
      in
        HOLogic.mk_eq (list_comb (Free (fname, fT), qs),
          Const (\<^const_name>\<open>undefined\<close>, rT))
        |> HOLogic.mk_Trueprop
        |> fold_rev Logic.all qs
      end
  in
    map mk_eqn fixes
  end

fun add_catchall ctxt fixes spec =
  let val fqgars = map (split_def ctxt (K true)) spec
      val arity_of = map (fn (fname,_,_,args,_) => (fname, length args)) fqgars
        |> AList.lookup (op =) #> the
  in
    spec @ mk_catchall fixes arity_of
  end

fun further_checks ctxt origs tss =
  let
    fun fail_redundant t =
      error (cat_lines ["Equation is redundant (covered by preceding clauses):",
        Syntax.string_of_term ctxt t])
    fun warn_missing strs =
      warning (cat_lines ("Missing patterns in function definition:" :: strs))

    val (tss', added) = chop (length origs) tss

    val _ = case chop 3 (flat added) of
        ([], []) => ()
      | (eqs, []) => warn_missing (map (Syntax.string_of_term ctxt) eqs)
      | (eqs, rest) => warn_missing (map (Syntax.string_of_term ctxt) eqs
          @ ["(" ^ string_of_int (length rest) ^ " more)"])

    val _ = (origs ~~ tss')
      |> map (fn (t, ts) => if null ts then fail_redundant t else ())
  in
    ()
  end

fun sequential_preproc (config as FunctionConfig {sequential, ...}) ctxt fixes spec =
  if sequential then
    let
      val (bnds, eqss) = split_list spec

      val eqs = map the_single eqss

      val feqs = eqs
        |> tap (check_defs ctxt fixes) (* Standard checks *)
        |> tap (map (check_pats ctxt)) (* More checks for sequential mode *)

      val compleqs = add_catchall ctxt fixes feqs (* Completion *)

      val spliteqs = Function_Split.split_all_equations ctxt compleqs
        |> tap (further_checks ctxt feqs)

      fun restore_spec thms =
        bnds ~~ take (length bnds) (unflat spliteqs thms)

      val spliteqs' = flat (take (length bnds) spliteqs)
      val fnames = map (fst o fst) fixes
      val indices = map (fn eq => find_index (curry op = (fname_of eq)) fnames) spliteqs'

      fun sort xs = partition_list (fn i => fn (j,_) => i = j) 0 (length fnames - 1) (indices ~~ xs)
        |> map (map snd)

      val bnds' = bnds @ replicate (length spliteqs - length bnds) Binding.empty_atts

      (* using theorem names for case name currently disabled *)
      val case_names = map_index (fn (i, (_, es)) => mk_case_names i "" (length es))
        (bnds' ~~ spliteqs) |> flat
    in
      (flat spliteqs, restore_spec, sort, case_names)
    end
  else
    Function_Common.empty_preproc check_defs config ctxt fixes spec

val _ = Theory.setup (Context.theory_map (Function_Common.set_preproc sequential_preproc))


val fun_config = FunctionConfig { sequential=true, default=NONE,
  domintros=false, partials=false }

fun gen_add_fun add lthy =
  let
    fun pat_completeness_auto ctxt =
      Pat_Completeness.pat_completeness_tac ctxt 1
      THEN auto_tac ctxt
    fun prove_termination lthy =
      Function.prove_termination NONE (Function_Common.termination_prover_tac false lthy) lthy
  in
    lthy
    |> add pat_completeness_auto |> snd
    |> prove_termination |> snd
  end

fun add_fun a b c = gen_add_fun (Function.add_function a b c)
fun add_fun_cmd a b c int = gen_add_fun (fn tac => Function.add_function_cmd a b c tac int)


val _ =
  Outer_Syntax.local_theory' \<^command_keyword>\<open>fun*\<close>
    "define general recursive functions (short version)"
    (function_parser fun_config
      >> (fn (config, (fixes, specs)) => add_fun_cmd fixes specs config))

end
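A key step above is pattern completion: `mk_catchall`/`add_catchall` append, for each defined function, a final equation `f a1 ... an = undefined` so that a sequential specification covers all remaining argument patterns before splitting. A rough toy illustration of that completion step in Python, assuming a made-up spec representation of `(function name, pattern list, right-hand side)` triples:

```python
def add_catchall(spec, arity):
    """Append one catch-all clause per function name, mapping any remaining
    argument tuple ('_' wildcards) to 'undefined'.
    Toy model of mk_catchall/add_catchall above; the spec encoding is invented."""
    fnames = {f for (f, _pats, _rhs) in spec}
    catchalls = [(f, ["_"] * arity[f], "undefined") for f in sorted(fnames)]
    return spec + catchalls

# A sequential two-clause spec for fib, missing e.g. the "Suc (Suc n)" case:
spec = [("fib", ["0"], "0"), ("fib", ["Suc 0"], "1")]
completed = add_catchall(spec, {"fib": 1})
print(completed[-1])  # ('fib', ['_'], 'undefined')
```

In the real package the catch-all equations are then split against the earlier clauses (`Function_Split.split_all_equations`), so the `undefined` cases only cover patterns the user really left out.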
@@ -0,0 +1,288 @@
(*  Title:      HOL/Tools/Function/function.ML
    Author:     Alexander Krauss, TU Muenchen

Main entry points to the function package.
*)

signature FUNCTION =
sig
  type info = Function_Common.info

  val add_function: (binding * typ option * mixfix) list ->
    Specification.multi_specs -> Function_Common.function_config ->
    (Proof.context -> tactic) -> local_theory -> info * local_theory

  val add_function_cmd: (binding * string option * mixfix) list ->
    Specification.multi_specs_cmd -> Function_Common.function_config ->
    (Proof.context -> tactic) -> bool -> local_theory -> info * local_theory

  val function: (binding * typ option * mixfix) list ->
    Specification.multi_specs -> Function_Common.function_config ->
    local_theory -> Proof.state

  val function_cmd: (binding * string option * mixfix) list ->
    Specification.multi_specs_cmd -> Function_Common.function_config ->
    bool -> local_theory -> Proof.state

  val prove_termination: term option -> tactic -> local_theory ->
    info * local_theory
  val prove_termination_cmd: string option -> tactic -> local_theory ->
    info * local_theory

  val termination : term option -> local_theory -> Proof.state
  val termination_cmd : string option -> local_theory -> Proof.state

  val get_congs : Proof.context -> thm list

  val get_info : Proof.context -> term -> info
end


structure Function : FUNCTION =
struct

open Function_Lib
open Function_Common

val simp_attribs =
  @{attributes [simp, nitpick_simp]}

val psimp_attribs =
  @{attributes [nitpick_psimp]}

fun note_derived (a, atts) (fname, thms) =
  Local_Theory.note ((derived_name fname a, atts), thms) #> apfst snd

fun add_simps fnames post sort extra_qualify label mod_binding moreatts simps lthy =
  let
    val spec = post simps
      |> map (apfst (apsnd (fn ats => moreatts @ ats)))
      |> map (apfst (apfst extra_qualify))

    val (saved_spec_simps, lthy') =
      fold_map Local_Theory.note spec lthy

    val saved_simps = maps snd saved_spec_simps
    val simps_by_f = sort saved_simps

    fun note fname simps =
      Local_Theory.note ((mod_binding (derived_name fname label), []), simps) #> snd
  in (saved_simps, fold2 note fnames simps_by_f lthy') end

fun prepare_function do_print prep fixspec eqns config lthy =
  let
    val ((fixes0, spec0), ctxt') = prep fixspec eqns lthy
    val fixes = map (apfst (apfst Binding.name_of)) fixes0
    val spec = map (fn (bnd, prop) => (bnd, [prop])) spec0
    val (eqs, post, sort_cont, cnames) = get_preproc lthy config ctxt' fixes spec

    val fnames = map (fst o fst) fixes0
    val defname = Binding.conglomerate fnames;

    val FunctionConfig {partials, default, ...} = config
    val _ =
      if is_some default
      then legacy_feature "\"function (default)\" -- use 'partial_function' instead"
      else ()

    val ((goal_state, cont), lthy') =
      Function_Mutual.prepare_function_mutual config defname fixes0 eqs lthy

    fun afterqed [[proof]] lthy1 =
      let
        val result = cont lthy1 (Thm.close_derivation \<^here> proof)
        val FunctionResult {fs, R, dom, psimps, simple_pinducts,
          termination, domintros, cases, ...} = result

        val pelims = Function_Elims.mk_partial_elim_rules lthy1 result

        val concealed_partial = if partials then I else Binding.concealed

        val addsmps = add_simps fnames post sort_cont

        val (((((psimps', [pinducts']), [termination']), cases'), pelims'), lthy2) =
          lthy1
          |> addsmps (concealed_partial o Binding.qualify false "partial")
               "psimps" concealed_partial psimp_attribs psimps
          ||>> Local_Theory.notes [((concealed_partial (derived_name defname "pinduct"), []),
                simple_pinducts |> map (fn th => ([th],
                  [Attrib.case_names cnames, Attrib.consumes (1 - Thm.nprems_of th)] @
                  @{attributes [induct pred]})))]
          ||>> (apfst snd o
            Local_Theory.note
              ((Binding.concealed (derived_name defname "termination"), []), [termination]))
          ||>> fold_map (note_derived ("cases", [Attrib.case_names cnames]))
                (fnames ~~ map single cases)
          ||>> fold_map (note_derived ("pelims", [Attrib.consumes 1, Attrib.constraints 1]))
                (fnames ~~ pelims)
          ||> (case domintros of NONE => I | SOME thms =>
                Local_Theory.note ((derived_name defname "domintros", []), thms) #> snd)

        val info =
          { add_simps=addsmps, fnames=fnames, case_names=cnames, psimps=psimps',
            pinducts=snd pinducts', simps=NONE, inducts=NONE, termination=termination', totality=NONE,
            fs=fs, R=R, dom=dom, defname=defname, is_partial=true, cases=flat cases',
            pelims=pelims', elims=NONE}

        val _ =
          Proof_Display.print_consts do_print (Position.thread_data ()) lthy2
            (K false) (map fst fixes)
      in
        (info,
         lthy2 |> Local_Theory.declaration {syntax = false, pervasive = false, pos = \<^here>}
           (fn phi => add_function_data (transform_function_data phi info)))
      end
  in
    ((goal_state, afterqed), lthy')
  end

fun gen_add_function do_print prep fixspec eqns config tac lthy =
  let
    val ((goal_state, afterqed), lthy') =
      prepare_function do_print prep fixspec eqns config lthy
    val pattern_thm =
      case SINGLE (tac lthy') goal_state of
        NONE => error "pattern completeness and compatibility proof failed"
      | SOME st => Goal.finish lthy' st
  in
    lthy'
    |> afterqed [[pattern_thm]]
  end

val add_function = gen_add_function false Specification.check_multi_specs
fun add_function_cmd a b c d int = gen_add_function int Specification.read_multi_specs a b c d

fun gen_function do_print prep fixspec eqns config lthy =
  let
    val ((goal_state, afterqed), lthy') =
      prepare_function do_print prep fixspec eqns config lthy
  in
    lthy'
    |> Proof.theorem NONE (snd oo afterqed) [[(Logic.unprotect (Thm.concl_of goal_state), [])]]
    |> Proof.refine_singleton (Method.primitive_text (K (K goal_state)))
  end

val function = gen_function false Specification.check_multi_specs
fun function_cmd a b c int = gen_function int Specification.read_multi_specs a b c

fun prepare_termination_proof prep_binding prep_term raw_term_opt lthy =
  let
    val term_opt = Option.map (prep_term lthy) raw_term_opt
    val info =
      (case term_opt of
        SOME t =>
          (case import_function_data t lthy of
            SOME info => info
          | NONE => error ("Not a function: " ^ quote (Syntax.string_of_term lthy t)))
      | NONE =>
          (case import_last_function lthy of
            SOME info => info
          | NONE => error "Not a function"))

    val { termination, fs, R, add_simps, case_names, psimps,
      pinducts, defname, fnames, cases, dom, pelims, ...} = info
    val domT = domain_type (fastype_of R)
    val goal = HOLogic.mk_Trueprop (HOLogic.mk_all ("x", domT, mk_acc domT R $ Free ("x", domT)))
    fun afterqed [[raw_totality]] lthy1 =
      let
        val totality = Thm.close_derivation \<^here> raw_totality
        val remove_domain_condition =
          full_simplify (put_simpset HOL_basic_ss lthy1
            addsimps [totality, @{thm True_implies_equals}])
        val tsimps = map remove_domain_condition psimps
        val tinduct = map remove_domain_condition pinducts
        val telims = map (map remove_domain_condition) pelims
      in
        lthy1
        |> add_simps prep_binding "simps" prep_binding simp_attribs tsimps
        ||> Code.declare_default_eqns (map (rpair true) tsimps)
        ||>> Local_Theory.note
          ((prep_binding (derived_name defname "induct"), [Attrib.case_names case_names]), tinduct)
        ||>> fold_map (note_derived ("elims", [Attrib.consumes 1, Attrib.constraints 1]))
          (map prep_binding fnames ~~ telims)
        |-> (fn ((simps, (_, inducts)), elims) => fn lthy2 =>
          let val info' = { is_partial=false, defname=defname, fnames=fnames, add_simps=add_simps,
            case_names=case_names, fs=fs, R=R, dom=dom, psimps=psimps, pinducts=pinducts,
            simps=SOME simps, inducts=SOME inducts, termination=termination, totality=SOME totality,
            cases=cases, pelims=pelims, elims=SOME elims}
            |> transform_function_data (Morphism.binding_morphism "" prep_binding)
          in
            (info',
             lthy2
             |> Local_Theory.declaration {syntax = false, pervasive = false, pos = \<^here>}
               (fn phi => add_function_data (transform_function_data phi info'))
             |> Spec_Rules.add Binding.empty Spec_Rules.equational_recdef fs tsimps)
          end)
      end
  in
    (goal, afterqed, termination)
  end

fun gen_prove_termination prep_term raw_term_opt tac lthy =
  let
    val (goal, afterqed, termination) =
      prepare_termination_proof I prep_term raw_term_opt lthy

    val totality = Goal.prove lthy [] [] goal (K tac)
  in
    afterqed [[totality]] lthy
  end

val prove_termination = gen_prove_termination Syntax.check_term
val prove_termination_cmd = gen_prove_termination Syntax.read_term

fun gen_termination prep_term raw_term_opt lthy =
  let
    val (goal, afterqed, termination) =
      prepare_termination_proof Binding.reset_pos prep_term raw_term_opt lthy
  in
    lthy
    |> Proof_Context.note_thms ""
      ((Binding.empty, [Context_Rules.rule_del]), [([allI], [])]) |> snd
    |> Proof_Context.note_thms ""
      ((Binding.empty, [Context_Rules.intro_bang (SOME 1)]), [([allI], [])]) |> snd
    |> Proof_Context.note_thms ""
      ((Binding.name "termination", [Context_Rules.intro_bang (SOME 0)]),
        [([Goal.norm_result lthy termination], [])]) |> snd
    |> Proof.theorem NONE (snd oo afterqed) [[(goal, [])]]
  end

val termination = gen_termination Syntax.check_term
val termination_cmd = gen_termination Syntax.read_term


(* Datatype hook to declare datatype congs as "function_congs" *)

fun add_case_cong n thy =
  let
    val cong = #case_cong (Old_Datatype_Data.the_info thy n)
      |> safe_mk_meta_eq
  in
    Context.theory_map (Function_Context_Tree.add_function_cong cong) thy
  end

val _ = Theory.setup (Old_Datatype_Data.interpretation (K (fold add_case_cong)))


(* get info *)

val get_congs = Function_Context_Tree.get_function_congs

fun get_info ctxt t = Function_Common.retrieve_function_data ctxt t
  |> the_single |> snd


(* outer syntax *)

val _ =
  Outer_Syntax.local_theory_to_proof' \<^command_keyword>\<open>function*\<close>
    "define general recursive functions"
    (function_parser default_config
      >> (fn (config, (fixes, specs)) => function_cmd fixes specs config))

val _ =
  Outer_Syntax.local_theory_to_proof \<^command_keyword>\<open>termination*\<close>
    "prove termination of a recursive function"
    (Scan.option Parse.term >> termination_cmd)

end
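The termination goal built by `prepare_termination_proof` states that every point of the domain type lies in the accessible part of the call relation `R` (`mk_acc domT R`): no infinite descending chain of recursive calls exists. On a finite relation that accessible part can be computed by a simple fixpoint, which the following toy Python sketch illustrates (the encoding of the relation as a set of `(smaller, larger)` pairs is an assumption for illustration only):

```python
def accessible(rel, universe):
    """Least set closed under: x is accessible if every rel-predecessor of x
    (every y with (y, x) in rel) is already accessible.
    Toy finite model of the 'mk_acc domT R' termination goal."""
    acc = set()
    changed = True
    while changed:
        changed = False
        for x in universe:
            if x not in acc and all(y in acc for (y, z) in rel if z == x):
                acc.add(x)
                changed = True
    return acc

# Calls descend 3 -> 2 -> 1 -> 0, so every point is accessible:
rel = {(0, 1), (1, 2), (2, 3)}
print(accessible(rel, {0, 1, 2, 3}))
```

A self-loop such as `(0, 0)` makes the point inaccessible, matching the intuition that a function calling itself on the same argument does not terminate.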
@@ -0,0 +1,451 @@
(*  Title:      Pure/Isar/specification.ML
    Author:     Makarius

Derived local theory specifications --- with type-inference and
toplevel polymorphism.
*)

signature SPECIFICATION =
sig
  val read_props: string list -> (binding * string option * mixfix) list -> Proof.context ->
    term list * Proof.context
  val check_spec_open: (binding * typ option * mixfix) list ->
    (binding * typ option * mixfix) list -> term list -> term -> Proof.context ->
    ((binding * typ option * mixfix) list * string list * (string -> Position.T list) * term) *
    Proof.context
  val read_spec_open: (binding * string option * mixfix) list ->
    (binding * string option * mixfix) list -> string list -> string -> Proof.context ->
    ((binding * typ option * mixfix) list * string list * (string -> Position.T list) * term) *
    Proof.context
  type multi_specs =
    ((Attrib.binding * term) * term list * (binding * typ option * mixfix) list) list
  type multi_specs_cmd =
    ((Attrib.binding * string) * string list * (binding * string option * mixfix) list) list
  val check_multi_specs: (binding * typ option * mixfix) list -> multi_specs -> Proof.context ->
    (((binding * typ) * mixfix) list * (Attrib.binding * term) list) * Proof.context
  val read_multi_specs: (binding * string option * mixfix) list -> multi_specs_cmd -> Proof.context ->
    (((binding * typ) * mixfix) list * (Attrib.binding * term) list) * Proof.context
  val axiomatization: (binding * typ option * mixfix) list ->
    (binding * typ option * mixfix) list -> term list ->
    (Attrib.binding * term) list -> theory -> (term list * thm list) * theory
  val axiomatization_cmd: (binding * string option * mixfix) list ->
    (binding * string option * mixfix) list -> string list ->
    (Attrib.binding * string) list -> theory -> (term list * thm list) * theory
  val axiom: Attrib.binding * term -> theory -> thm * theory
  val definition: (binding * typ option * mixfix) option ->
    (binding * typ option * mixfix) list -> term list -> Attrib.binding * term ->
    local_theory -> (term * (string * thm)) * local_theory
  val definition_cmd: (binding * string option * mixfix) option ->
    (binding * string option * mixfix) list -> string list -> Attrib.binding * string ->
    bool -> local_theory -> (term * (string * thm)) * local_theory
  val abbreviation: Syntax.mode -> (binding * typ option * mixfix) option ->
    (binding * typ option * mixfix) list -> term -> bool -> local_theory -> local_theory
  val abbreviation_cmd: Syntax.mode -> (binding * string option * mixfix) option ->
    (binding * string option * mixfix) list -> string -> bool -> local_theory -> local_theory
  val alias: binding * string -> local_theory -> local_theory
  val alias_cmd: binding * (xstring * Position.T) -> local_theory -> local_theory
  val type_alias: binding * string -> local_theory -> local_theory
  val type_alias_cmd: binding * (xstring * Position.T) -> local_theory -> local_theory
  val theorems: string ->
    (Attrib.binding * Attrib.thms) list ->
    (binding * typ option * mixfix) list ->
    bool -> local_theory -> (string * thm list) list * local_theory
  val theorems_cmd: string ->
    (Attrib.binding * (Facts.ref * Token.src list) list) list ->
    (binding * string option * mixfix) list ->
    bool -> local_theory -> (string * thm list) list * local_theory
  val theorem: bool -> string -> Method.text option ->
    (thm list list -> local_theory -> local_theory) -> Attrib.binding ->
    string list -> Element.context_i list -> Element.statement_i ->
    bool -> local_theory -> Proof.state
  val theorem_cmd: bool -> string -> Method.text option ->
    (thm list list -> local_theory -> local_theory) -> Attrib.binding ->
    (xstring * Position.T) list -> Element.context list -> Element.statement ->
|
||||
bool -> local_theory -> Proof.state
|
||||
val schematic_theorem: bool -> string -> Method.text option ->
|
||||
(thm list list -> local_theory -> local_theory) -> Attrib.binding ->
|
||||
string list -> Element.context_i list -> Element.statement_i ->
|
||||
bool -> local_theory -> Proof.state
|
||||
val schematic_theorem_cmd: bool -> string -> Method.text option ->
|
||||
(thm list list -> local_theory -> local_theory) -> Attrib.binding ->
|
||||
(xstring * Position.T) list -> Element.context list -> Element.statement ->
|
||||
bool -> local_theory -> Proof.state
|
||||
end;
|
||||
|
||||
structure Specification: SPECIFICATION =
|
||||
struct
|
||||
|
||||
(* prepare propositions *)
|
||||
|
||||
fun read_props raw_props raw_fixes ctxt =
|
||||
let
|
||||
val (_, ctxt1) = ctxt |> Proof_Context.add_fixes_cmd raw_fixes;
|
||||
val props1 = map (Syntax.parse_prop ctxt1) raw_props;
|
||||
val (props2, ctxt2) = ctxt1 |> fold_map Variable.fix_dummy_patterns props1;
|
||||
val props3 = Syntax.check_props ctxt2 props2;
|
||||
val ctxt3 = ctxt2 |> fold Variable.declare_term props3;
|
||||
in (props3, ctxt3) end;
|
||||
|
||||
|
||||
(* prepare specification *)
|
||||
|
||||
fun get_positions ctxt x =
|
||||
let
|
||||
fun get Cs (Const ("_type_constraint_", C) $ t) = get (C :: Cs) t
|
||||
| get Cs (Free (y, T)) =
|
||||
if x = y then
|
||||
map_filter Term_Position.decode_positionT
|
||||
(T :: map (Type.constraint_type ctxt) Cs)
|
||||
else []
|
||||
| get _ (t $ u) = get [] t @ get [] u
|
||||
| get _ (Abs (_, _, t)) = get [] t
|
||||
| get _ _ = [];
|
||||
in get [] end;
|
||||
|
||||
local
|
||||
|
||||
fun prep_decls prep_var raw_vars ctxt =
|
||||
let
|
||||
val (vars, ctxt') = fold_map prep_var raw_vars ctxt;
|
||||
val (xs, ctxt'') = ctxt'
|
||||
|> Context_Position.set_visible false
|
||||
|> Proof_Context.add_fixes vars
|
||||
||> Context_Position.restore_visible ctxt';
|
||||
val _ =
|
||||
Context_Position.reports ctxt''
|
||||
(map (Binding.pos_of o #1) vars ~~
|
||||
map (Variable.markup_entity_def ctxt'' ##> Properties.remove Markup.kindN) xs);
|
||||
in ((vars, xs), ctxt'') end;
|
||||
|
||||
fun close_form ctxt ys prems concl =
|
||||
let
|
||||
val xs = rev (fold (Variable.add_free_names ctxt) (prems @ [concl]) (rev ys));
|
||||
|
||||
val pos_props = Logic.strip_imp_concl concl :: Logic.strip_imp_prems concl @ prems;
|
||||
fun get_pos x = maps (get_positions ctxt x) pos_props;
|
||||
val _ = Context_Position.reports ctxt (maps (Syntax_Phases.reports_of_scope o get_pos) xs);
|
||||
in Logic.close_prop_constraint (Variable.default_type ctxt) (xs ~~ xs) prems concl end;
|
||||
|
||||
fun dummy_frees ctxt xs tss =
|
||||
let
|
||||
val names =
|
||||
Variable.names_of ((fold o fold) Variable.declare_term tss ctxt)
|
||||
|> fold Name.declare xs;
|
||||
val (tss', _) = (fold_map o fold_map) Term.free_dummy_patterns tss names;
|
||||
in tss' end;
|
||||
|
||||
fun prep_spec_open prep_var parse_prop raw_vars raw_params raw_prems raw_concl ctxt =
|
||||
let
|
||||
val ((vars, xs), vars_ctxt) = prep_decls prep_var raw_vars ctxt;
|
||||
val (ys, params_ctxt) = vars_ctxt |> fold_map prep_var raw_params |-> Proof_Context.add_fixes;
|
||||
|
||||
val props =
|
||||
map (parse_prop params_ctxt) (raw_concl :: raw_prems)
|
||||
|> singleton (dummy_frees params_ctxt (xs @ ys));
|
||||
|
||||
val concl :: prems = Syntax.check_props params_ctxt props;
|
||||
val spec = Logic.list_implies (prems, concl);
|
||||
val spec_ctxt = Variable.declare_term spec params_ctxt;
|
||||
|
||||
fun get_pos x = maps (get_positions spec_ctxt x) props;
|
||||
in ((vars, xs, get_pos, spec), spec_ctxt) end;
|
||||
|
||||
fun prep_specs prep_var parse_prop prep_att raw_vars raw_specss ctxt =
|
||||
let
|
||||
val ((vars, xs), vars_ctxt) = prep_decls prep_var raw_vars ctxt;
|
||||
|
||||
val propss0 =
|
||||
raw_specss |> map (fn ((_, raw_concl), raw_prems, raw_params) =>
|
||||
let val (ys, ctxt') = vars_ctxt |> fold_map prep_var raw_params |-> Proof_Context.add_fixes
|
||||
in (ys, map (pair ctxt') (raw_concl :: raw_prems)) end);
|
||||
val props =
|
||||
burrow (grouped 10 Par_List.map_independent (uncurry parse_prop)) (map #2 propss0)
|
||||
|> dummy_frees vars_ctxt xs
|
||||
|> map2 (fn (ys, _) => fn concl :: prems => close_form vars_ctxt ys prems concl) propss0;
|
||||
|
||||
val specs = Syntax.check_props vars_ctxt props;
|
||||
val specs' = specs |> map (DOF_core.elaborate_term ctxt)
|
||||
val specs_ctxt = vars_ctxt |> fold Variable.declare_term specs;
|
||||
|
||||
val ps = specs_ctxt |> fold_map Proof_Context.inferred_param xs |> fst;
|
||||
val params = map2 (fn (b, _, mx) => fn (_, T) => ((b, T), mx)) vars ps;
|
||||
val name_atts: Attrib.binding list =
|
||||
map (fn ((name, atts), _) => (name, map (prep_att ctxt) atts)) (map #1 raw_specss);
|
||||
in ((params, name_atts ~~ specs'), specs_ctxt) end;
|
||||
|
||||
in
|
||||
|
||||
val check_spec_open = prep_spec_open Proof_Context.cert_var (K I);
|
||||
val read_spec_open = prep_spec_open Proof_Context.read_var Syntax.parse_prop;
|
||||
|
||||
type multi_specs =
|
||||
((Attrib.binding * term) * term list * (binding * typ option * mixfix) list) list;
|
||||
type multi_specs_cmd =
|
||||
((Attrib.binding * string) * string list * (binding * string option * mixfix) list) list;
|
||||
|
||||
fun check_multi_specs xs specs =
|
||||
prep_specs Proof_Context.cert_var (K I) (K I) xs specs;
|
||||
|
||||
fun read_multi_specs xs specs =
|
||||
prep_specs Proof_Context.read_var Syntax.parse_prop Attrib.check_src xs specs;
|
||||
|
||||
end;
|
||||
|
||||
|
||||
(* axiomatization -- within global theory *)
|
||||
|
||||
fun gen_axioms prep_stmt prep_att raw_decls raw_fixes raw_prems raw_concls thy =
|
||||
let
|
||||
(*specification*)
|
||||
val ({vars, propss = [prems, concls], ...}, vars_ctxt) =
|
||||
Proof_Context.init_global thy
|
||||
|> prep_stmt (raw_decls @ raw_fixes) ((map o map) (rpair []) [raw_prems, map snd raw_concls]);
|
||||
val (decls, fixes) = chop (length raw_decls) vars;
|
||||
|
||||
val frees =
|
||||
rev ((fold o fold) (Variable.add_frees vars_ctxt) [prems, concls] [])
|
||||
|> map (fn (x, T) => (x, Free (x, T)));
|
||||
val close = Logic.close_prop (map #2 fixes @ frees) prems;
|
||||
val specs =
|
||||
map ((apsnd o map) (prep_att vars_ctxt) o fst) raw_concls ~~ map close concls;
|
||||
|
||||
val spec_name =
|
||||
Binding.conglomerate (if null decls then map (#1 o #1) specs else map (#1 o #1) decls);
|
||||
|
||||
|
||||
(*consts*)
|
||||
val (consts, consts_thy) = thy
|
||||
|> fold_map (fn ((b, _, mx), (_, t)) => Theory.specify_const ((b, Term.type_of t), mx)) decls;
|
||||
val subst = Term.subst_atomic (map (#2 o #2) decls ~~ consts);
|
||||
|
||||
(*axioms*)
|
||||
val (axioms, axioms_thy) =
|
||||
(specs, consts_thy) |-> fold_map (fn ((b, atts), prop) =>
|
||||
Thm.add_axiom_global (b, subst prop) #>> (fn (_, th) => ((b, atts), [([th], [])])));
|
||||
|
||||
(*facts*)
|
||||
val (facts, facts_lthy) = axioms_thy
|
||||
|> Named_Target.theory_init
|
||||
|> Spec_Rules.add spec_name Spec_Rules.Unknown consts (maps (maps #1 o #2) axioms)
|
||||
|> Local_Theory.notes axioms;
|
||||
|
||||
in ((consts, map (the_single o #2) facts), Local_Theory.exit_global facts_lthy) end;
|
||||
|
||||
val axiomatization = gen_axioms Proof_Context.cert_stmt (K I);
|
||||
val axiomatization_cmd = gen_axioms Proof_Context.read_stmt Attrib.check_src;
|
||||
|
||||
fun axiom (b, ax) = axiomatization [] [] [] [(b, ax)] #>> (hd o snd);
|
||||
|
||||
|
||||
(* definition *)
|
||||
|
||||
fun gen_def prep_spec prep_att raw_var raw_params raw_prems ((a, raw_atts), raw_spec) int lthy =
|
||||
let
|
||||
val atts = map (prep_att lthy) raw_atts;
|
||||
|
||||
val ((vars, xs, get_pos, spec), _) = lthy
|
||||
|> prep_spec (the_list raw_var) raw_params raw_prems raw_spec;
|
||||
val (((x, T), rhs), prove) = Local_Defs.derived_def lthy get_pos {conditional = true} spec;
|
||||
val _ = Name.reject_internal (x, []);
|
||||
val (b, mx) =
|
||||
(case (vars, xs) of
|
||||
([], []) => (Binding.make (x, (case get_pos x of [] => Position.none | p :: _ => p)), NoSyn)
|
||||
| ([(b, _, mx)], [y]) =>
|
||||
if x = y then (b, mx)
|
||||
else
|
||||
error ("Head of definition " ^ quote x ^ " differs from declaration " ^ quote y ^
|
||||
Position.here (Binding.pos_of b)));
|
||||
|
||||
val name = Thm.def_binding_optional b a;
|
||||
val ((lhs, (_, raw_th)), lthy2) = lthy
|
||||
|> Local_Theory.define_internal ((b, mx), ((Binding.suffix_name "_raw" name, []), rhs));
|
||||
|
||||
val ([(def_name, [th])], lthy3) = lthy2
|
||||
|> Local_Theory.notes [((name, atts), [([prove lthy2 raw_th], [])])];
|
||||
|
||||
val lthy4 = lthy3
|
||||
|> Spec_Rules.add name Spec_Rules.equational [lhs] [th]
|
||||
|> Code.declare_default_eqns [(th, true)];
|
||||
|
||||
val lhs' = Morphism.term (Local_Theory.target_morphism lthy4) lhs;
|
||||
|
||||
val _ =
|
||||
Proof_Display.print_consts int (Position.thread_data ()) lthy4
|
||||
(Frees.defined (Frees.build (Frees.add_frees lhs'))) [(x, T)];
|
||||
in ((lhs, (def_name, th)), lthy4) end;
|
||||
|
||||
fun definition xs ys As B = gen_def check_spec_open (K I) xs ys As B false;
|
||||
val definition_cmd = gen_def read_spec_open Attrib.check_src;
|
||||
|
||||
|
||||
(* abbreviation *)
|
||||
|
||||
fun gen_abbrev prep_spec mode raw_var raw_params raw_spec int lthy =
|
||||
let
|
||||
val lthy1 = lthy |> Proof_Context.set_syntax_mode mode;
|
||||
val ((vars, xs, get_pos, spec), _) = lthy
|
||||
|> Proof_Context.set_mode Proof_Context.mode_abbrev
|
||||
|> prep_spec (the_list raw_var) raw_params [] raw_spec;
|
||||
val ((x, T), rhs) = Local_Defs.abs_def (#2 (Local_Defs.cert_def lthy1 get_pos spec));
|
||||
val _ = Name.reject_internal (x, []);
|
||||
val (b, mx) =
|
||||
(case (vars, xs) of
|
||||
([], []) => (Binding.make (x, (case get_pos x of [] => Position.none | p :: _ => p)), NoSyn)
|
||||
| ([(b, _, mx)], [y]) =>
|
||||
if x = y then (b, mx)
|
||||
else
|
||||
error ("Head of abbreviation " ^ quote x ^ " differs from declaration " ^ quote y ^
|
||||
Position.here (Binding.pos_of b)));
|
||||
val lthy2 = lthy1
|
||||
|> Local_Theory.abbrev mode ((b, mx), rhs) |> snd
|
||||
|> Proof_Context.restore_syntax_mode lthy;
|
||||
|
||||
val _ = Proof_Display.print_consts int (Position.thread_data ()) lthy2 (K false) [(x, T)];
|
||||
in lthy2 end;
|
||||
|
||||
val abbreviation = gen_abbrev check_spec_open;
|
||||
val abbreviation_cmd = gen_abbrev read_spec_open;
|
||||
|
||||
|
||||
(* alias *)
|
||||
|
||||
fun gen_alias decl check (b, arg) lthy =
|
||||
let
|
||||
val (c, reports) = check {proper = true, strict = false} lthy arg;
|
||||
val _ = Context_Position.reports lthy reports;
|
||||
in decl b c lthy end;
|
||||
|
||||
val alias =
|
||||
gen_alias Local_Theory.const_alias (K (K (fn c => (c, []))));
|
||||
val alias_cmd =
|
||||
gen_alias Local_Theory.const_alias
|
||||
(fn flags => fn ctxt => fn (c, pos) =>
|
||||
apfst (#1 o dest_Const) (Proof_Context.check_const flags ctxt (c, [pos])));
|
||||
|
||||
val type_alias =
|
||||
gen_alias Local_Theory.type_alias (K (K (fn c => (c, []))));
|
||||
val type_alias_cmd =
|
||||
gen_alias Local_Theory.type_alias (apfst (#1 o dest_Type) ooo Proof_Context.check_type_name);
|
||||
|
||||
|
||||
(* fact statements *)
|
||||
|
||||
local
|
||||
|
||||
fun gen_theorems prep_fact prep_att add_fixes
|
||||
kind raw_facts raw_fixes int lthy =
|
||||
let
|
||||
val facts = raw_facts |> map (fn ((name, atts), bs) =>
|
||||
((name, map (prep_att lthy) atts),
|
||||
bs |> map (fn (b, more_atts) => (prep_fact lthy b, map (prep_att lthy) more_atts))));
|
||||
val (_, ctxt') = add_fixes raw_fixes lthy;
|
||||
|
||||
val facts' = facts
|
||||
|> Attrib.partial_evaluation ctxt'
|
||||
|> Attrib.transform_facts (Proof_Context.export_morphism ctxt' lthy);
|
||||
val (res, lthy') = lthy |> Local_Theory.notes_kind kind facts';
|
||||
val _ =
|
||||
Proof_Display.print_results
|
||||
{interactive = int, pos = Position.thread_data (), proof_state = false}
|
||||
lthy' ((kind, ""), res);
|
||||
in (res, lthy') end;
|
||||
|
||||
in
|
||||
|
||||
val theorems = gen_theorems (K I) (K I) Proof_Context.add_fixes;
|
||||
val theorems_cmd = gen_theorems Proof_Context.get_fact Attrib.check_src Proof_Context.add_fixes_cmd;
|
||||
|
||||
end;
|
||||
|
||||
|
||||
(* complex goal statements *)
|
||||
|
||||
local
|
||||
|
||||
fun prep_statement prep_att prep_stmt raw_elems raw_stmt ctxt =
|
||||
let
|
||||
val (stmt, elems_ctxt) = prep_stmt raw_elems raw_stmt ctxt;
|
||||
val prems = Assumption.local_prems_of elems_ctxt ctxt;
|
||||
val stmt_ctxt = fold (fold (Proof_Context.augment o fst) o snd) stmt elems_ctxt;
|
||||
in
|
||||
(case raw_stmt of
|
||||
Element.Shows _ =>
|
||||
let val stmt' = Attrib.map_specs (map prep_att) stmt
|
||||
in (([], prems, stmt', NONE), stmt_ctxt) end
|
||||
| Element.Obtains raw_obtains =>
|
||||
let
|
||||
val asms_ctxt = stmt_ctxt
|
||||
|> fold (fn ((name, _), asm) =>
|
||||
snd o Proof_Context.add_assms Assumption.assume_export
|
||||
[((name, [Context_Rules.intro_query NONE]), asm)]) stmt;
|
||||
val that = Assumption.local_prems_of asms_ctxt stmt_ctxt;
|
||||
val ([(_, that')], that_ctxt) = asms_ctxt
|
||||
|> Proof_Context.set_stmt true
|
||||
|> Proof_Context.note_thmss "" [((Binding.name Auto_Bind.thatN, []), [(that, [])])]
|
||||
||> Proof_Context.restore_stmt asms_ctxt;
|
||||
|
||||
val stmt' = [(Binding.empty_atts, [(#2 (#1 (Obtain.obtain_thesis ctxt)), [])])];
|
||||
in ((Obtain.obtains_attribs raw_obtains, prems, stmt', SOME that'), that_ctxt) end)
|
||||
end;
|
||||
|
||||
fun gen_theorem schematic bundle_includes prep_att prep_stmt
|
||||
long kind before_qed after_qed (name, raw_atts) raw_includes raw_elems raw_concl int lthy =
|
||||
let
|
||||
val _ = Local_Theory.assert lthy;
|
||||
|
||||
val elems = raw_elems |> map (Element.map_ctxt_attrib (prep_att lthy));
|
||||
val ((more_atts, prems, stmt, facts), goal_ctxt) = lthy
|
||||
|> bundle_includes raw_includes
|
||||
|> prep_statement (prep_att lthy) prep_stmt elems raw_concl;
|
||||
val atts = more_atts @ map (prep_att lthy) raw_atts;
|
||||
|
||||
val pos = Position.thread_data ();
|
||||
val print_results =
|
||||
Proof_Display.print_results {interactive = int, pos = pos, proof_state = false};
|
||||
|
||||
fun after_qed' results goal_ctxt' =
|
||||
let
|
||||
val results' =
|
||||
burrow (map (Goal.norm_result lthy) o Proof_Context.export goal_ctxt' lthy) results;
|
||||
val (res, lthy') =
|
||||
if forall (Binding.is_empty_atts o fst) stmt then (map (pair "") results', lthy)
|
||||
else
|
||||
Local_Theory.notes_kind kind
|
||||
(map2 (fn (b, _) => fn ths => (b, [(ths, [])])) stmt results') lthy;
|
||||
val lthy'' =
|
||||
if Binding.is_empty_atts (name, atts)
|
||||
then (print_results lthy' ((kind, ""), res); lthy')
|
||||
else
|
||||
let
|
||||
val ([(res_name, _)], lthy'') =
|
||||
Local_Theory.notes_kind kind [((name, atts), [(maps #2 res, [])])] lthy';
|
||||
val _ = print_results lthy' ((kind, res_name), res);
|
||||
in lthy'' end;
|
||||
in after_qed results' lthy'' end;
|
||||
|
||||
val prems_name = if long then Auto_Bind.assmsN else Auto_Bind.thatN;
|
||||
in
|
||||
goal_ctxt
|
||||
|> not (null prems) ?
|
||||
(Proof_Context.note_thmss "" [((Binding.name prems_name, []), [(prems, [])])] #> snd)
|
||||
|> Proof.theorem before_qed after_qed' (map snd stmt)
|
||||
|> (case facts of NONE => I | SOME ths => Proof.refine_insert ths)
|
||||
|> tap (fn state => not schematic andalso Proof.schematic_goal state andalso
|
||||
error "Illegal schematic goal statement")
|
||||
end;
|
||||
|
||||
in
|
||||
|
||||
val theorem =
|
||||
gen_theorem false Bundle.includes (K I) Expression.cert_statement;
|
||||
val theorem_cmd =
|
||||
gen_theorem false Bundle.includes_cmd Attrib.check_src Expression.read_statement;
|
||||
|
||||
val schematic_theorem =
|
||||
gen_theorem true Bundle.includes (K I) Expression.cert_statement;
|
||||
val schematic_theorem_cmd =
|
||||
gen_theorem true Bundle.includes_cmd Attrib.check_src Expression.read_statement;
|
||||
|
||||
end;
|
||||
|
||||
end;
|
@@ -0,0 +1,237 @@
(*************************************************************************
 * Copyright (C)
 *               2019      The University of Exeter
 *               2018-2019 The University of Paris-Saclay
 *               2018      The University of Sheffield
 *
 * License:
 *   This program can be redistributed and/or modified under the terms
 *   of the 2-clause BSD-style license.
 *
 * SPDX-License-Identifier: BSD-2-Clause
 *************************************************************************)

chapter\<open>Testing Freeform and Formal Elements from the scholarly-paper Ontology\<close>

theory
  AssnsLemmaThmEtc
imports
  "Isabelle_DOF-Ontologies.Conceptual"
  "Isabelle_DOF.scholarly_paper"
  "Isabelle_DOF_Unit_Tests_document"
  TestKit
begin

section\<open>Test Objective\<close>

text\<open>Testing core elements of \<^theory>\<open>Isabelle_DOF.scholarly_paper\<close> wrt.
existence, controllability via implicit and explicit default classes, and potential
LaTeX layout.\<close>

text\<open>Current status:\<close>
print_doc_classes
print_doc_items


section\<open>An Example for Use-before-Declaration of Formal Content\<close>

text*[aa::F, properties = "[@{term ''True''}]"]
\<open>Our definition of the HOL-Logic has the following properties:\<close>
assert*\<open>F.properties @{F \<open>aa\<close>} = [@{term ''True''}]\<close>

text\<open>For now, as the term annotation is not bound to a meta-logic that would translate
\<^term>\<open>[@{term ''True''}]\<close> to \<^term>\<open>[True]\<close>, we cannot use the HOL \<^const>\<open>True\<close> constant
in the assertion.\<close>

ML\<open> @{term_ "[@{term \<open>True \<longrightarrow> True \<close>}]"}; (* with isa-check *) \<close>

ML\<open>
(* Check the default classes, which should be in a neutral (unset) state. *)
(* Note that in this state, the "implicit default" is "math_content". *)
@{assert} (Config.get_global @{theory} Definition_default_class = "");
@{assert} (Config.get_global @{theory} Lemma_default_class = "");
@{assert} (Config.get_global @{theory} Theorem_default_class = "");
@{assert} (Config.get_global @{theory} Proposition_default_class = "");
@{assert} (Config.get_global @{theory} Premise_default_class = "");
@{assert} (Config.get_global @{theory} Corollary_default_class = "");
@{assert} (Config.get_global @{theory} Consequence_default_class = "");
@{assert} (Config.get_global @{theory} Assumption_default_class = "");
@{assert} (Config.get_global @{theory} Hypothesis_default_class = "");
@{assert} (Config.get_global @{theory} Consequence_default_class = "");
@{assert} (Config.get_global @{theory} Assertion_default_class = "");
@{assert} (Config.get_global @{theory} Proof_default_class = "");
@{assert} (Config.get_global @{theory} Example_default_class = "");
\<close>


Definition*[e1]\<open>Lorem ipsum dolor sit amet, ... \<close>
text\<open>Note that this should yield a warning, since \<^theory_text>\<open>Definition*\<close> uses as "implicit default" the class
\<^doc_class>\<open>math_content\<close>, which has no \<^term>\<open>text_element.level\<close> set; in this context, however,
it is required to be a positive number, since it is \<^term>\<open>text_element.referentiable\<close>.
This is intended behaviour, in order to nudge the user to be more specific.\<close>

text\<open>A repair looks like this:\<close>
declare [[Definition_default_class = "definition"]]

text\<open>Now, declare a forward reference to the formal content:\<close>

declare_reference*[e1bisbis::"definition"]

text\<open>... which makes it possible to refer, in a freeform definition, to its formal counterpart,
which will appear textually later. With this pragmatics, an "out-of-order presentation"
can be achieved within \<^theory>\<open>Isabelle_DOF.scholarly_paper\<close> for the most common cases.\<close>


(*<*) (* PDF references to Definition* are not implemented *)
Definition*[e1bis::"definition", short_name="\<open>Nice lemma.\<close>"]
\<open>Lorem ipsum dolor sit amet, ...
This is formally defined as follows in @{definition (unchecked) "e1bisbis"}\<close>
definition*[e1bisbis, status=formal] e :: int where "e = 2"
(*>*)

section\<open>Tests for Theorems, Assertions, Assumptions, Hypotheses, etc.\<close>

declare [[Theorem_default_class     = "theorem",
          Premise_default_class     = "premise",
          Hypothesis_default_class  = "hypothesis",
          Assumption_default_class  = "assumption",
          Conclusion_default_class  = "conclusion",
          Consequence_default_class = "consequence",
          Assertion_default_class   = "assertion",
          Corollary_default_class   = "corollary",
          Proof_default_class       = "math_proof",
          Conclusion_default_class  = "conclusion_stmt"]]

Theorem*[e2]\<open>... suspendisse non arcu malesuada mollis, nibh morbi, ... \<close>

theorem*[e2bis::"theorem", status=formal] f : "e = 1+1" unfolding e_def by simp

(*<*) (* @{theorem "e2bis"} breaks LaTeX generation ... *)
Lemma*[e3,level="Some 2"]
\<open>... phasellus amet id massa nunc, pede suscipit repellendus, ... @{theorem "e2bis"} \<close>
(*>*)
Proof*[d10, short_name="\<open>Induction over Tinea pedis.\<close>"]\<open>Freeform Proof\<close>

lemma*[dfgd::"lemma"] q: "All (\<lambda>x. X \<and> Y \<longrightarrow> True)" oops
text-assert-error\<open>@{lemma dfgd} \<close>\<open>Undefined instance:\<close> \<comment> \<open>oops'ed objects are not referentiable.\<close>

text\<open>... in ut tortor eleifend augue pretium consectetuer...
Lectus accumsan velit ultrices, ...\<close>

Proposition*[d2::"proposition"]\<open>"Freeform Proposition"\<close>

Assumption*[d3] \<open>"Freeform Assertion"\<close>

Premise*[d4]\<open>"Freeform Premise"\<close>

Corollary*[d5]\<open>"Freeform Corollary"\<close>

Consequence*[d6::scholarly_paper.consequence]\<open>"Freeform Consequence"\<close> \<comment> \<open>long name just for test\<close>

(*<*)
declare_reference*[ababa::scholarly_paper.assertion]
Assertion*[d7]\<open>Freeform Assumption with forward reference to the formal
@{assertion (unchecked) ababa}.\<close>
assert*[ababa::assertion] "3 < (4::int)"
assert*[ababab::assertion] "0 < (4::int)"
(*>*)

Conclusion*[d8]\<open>"Freeform Conclusion"\<close>

Hypothesis*[d9]\<open>"Freeform Hypothesis"\<close>

Example*[d11::math_example]\<open>"Freeform Example"\<close>

text\<open>An example for the ontology-specification character of short-cuts such as
@{command "assert*"}: in the following, we use the same notation to refer to a completely
different class. "F" and "assertion" have only in common that they possess the attribute
@{const [names_short] \<open>properties\<close>}:\<close>

section\<open>Exhaustive Scholarly\_paper Test\<close>

subsection\<open>Global Structural Elements\<close>
(* Maybe it is neither necessary nor possible to test these here... title is unique in
   a document, for example. To be commented out if needed. *)
text*[tt1::scholarly_paper.title]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt2::scholarly_paper.author]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt3::scholarly_paper.article]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt4::scholarly_paper.annex]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt5::scholarly_paper.abstract]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt6::scholarly_paper.subtitle]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt7::scholarly_paper.bibliography]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt8::scholarly_paper.introduction]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt9::scholarly_paper.related_work]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt11::scholarly_paper.text_section]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt12::scholarly_paper.background ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt13::scholarly_paper.conclusion ]\<open>Lectus accumsan velit ultrices, ...\<close>

subsection\<open>Technical Content Specific Elements\<close>

text*[tu1::scholarly_paper.axiom ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu1bis::scholarly_paper.math_content, mcc="axm" ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu2::scholarly_paper.lemma ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu3::scholarly_paper.example ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu4::scholarly_paper.premise ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu5::scholarly_paper.theorem ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu6::scholarly_paper.assertion]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu7::scholarly_paper.corollary]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu9::scholarly_paper.technical]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu10::scholarly_paper.assumption ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu13::scholarly_paper.definition ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu15::scholarly_paper.experiment ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu16::scholarly_paper.hypothesis ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu17::scholarly_paper.math_proof ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu18::scholarly_paper.consequence]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu19::scholarly_paper.math_formal]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu20::scholarly_paper.proposition]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu21::scholarly_paper.math_content ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu22::scholarly_paper.math_example ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu23::scholarly_paper.conclusion_stmt ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu24::scholarly_paper.math_motivation ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu25::scholarly_paper.tech_definition ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu28::scholarly_paper.eng_example ]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tt10::scholarly_paper.tech_example]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu8::scholarly_paper.tech_code] \<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu27::scholarly_paper.engineering_content]\<open>Lectus accumsan velit ultrices, ...\<close>
text*[tu14::scholarly_paper.evaluation ]\<open>Lectus accumsan velit ultrices, ...\<close>

text\<open> @{axiom tu1} @{lemma tu2} @{example tu3} @{premise tu4} @{theorem tu5} @{assertion tu6}
      @{technical tu9} @{assumption tu10 } @{definition tu13 }
      @{experiment tu15 } @{hypothesis tu16 } @{math_proof tu17 }
      @{consequence tu18 } @{math_formal tu19 } @{proposition tu20 }
      @{math_content tu21 } @{math_example tu22 } @{conclusion_stmt tu23 }
      @{math_motivation tu24 } @{tech_definition tu25 } @{eng_example tu28 }
      @{tech_example tt10 } @{tech_code tu8 } @{engineering_content tu27 }
      @{evaluation tu14 }
\<close>

subsection\<open>The Use in Macros\<close>

Lemma*[ttu2::scholarly_paper.lemma ]\<open>Lectus accumsan velit ultrices, ...\<close>
Example*[ttu3::scholarly_paper.math_example ]\<open>Lectus accumsan velit ultrices, ...\<close>
Premise*[ttu4::scholarly_paper.premise ]\<open>Lectus accumsan velit ultrices, ...\<close>
Theorem*[ttu5::scholarly_paper.theorem ]\<open>Lectus accumsan velit ultrices, ...\<close>
Assertion*[ttu6::scholarly_paper.assertion]\<open>Lectus accumsan velit ultrices, ...\<close>
Corollary*[ttu7::scholarly_paper.corollary]\<open>Lectus accumsan velit ultrices, ...\<close>
Assumption*[ttu10::scholarly_paper.assumption ]\<open>Lectus accumsan velit ultrices, ...\<close>
Definition*[ttu13::scholarly_paper.definition ]\<open>Lectus accumsan velit ultrices, ...\<close>
Hypothesis*[ttu16::scholarly_paper.hypothesis ]\<open>Lectus accumsan velit ultrices, ...\<close>
Proof*[ttu17::scholarly_paper.math_proof ]\<open>Lectus accumsan velit ultrices, ...\<close>
Consequence*[ttu18::scholarly_paper.consequence]\<open>Lectus accumsan velit ultrices, ...\<close>
Proposition*[ttu20::scholarly_paper.proposition]\<open>Lectus accumsan velit ultrices, ...\<close>
Conclusion*[ttu23::scholarly_paper.conclusion_stmt ]\<open>Lectus accumsan velit ultrices, ...\<close>
(* Definition*[ttu25::scholarly_paper.tech_definition ]\<open>Lectus accumsan velit ultrices, ...\<close>
   interesting modeling bug.
*)
(* Example*[ttu28::scholarly_paper.eng_example ]\<open>Lectus accumsan velit ultrices, ...\<close>
   interesting modeling bug.
*)
text\<open> @{lemma ttu2} @{math_example ttu3} @{premise ttu4} @{theorem ttu5} @{assertion ttu6}
      @{assumption ttu10 } @{definition ttu13 }
      @{hypothesis ttu16 } @{math_proof ttu17 }
      @{consequence ttu18 } @{proposition ttu20 }
      @{math_content tu21 } @{conclusion_stmt ttu23 }
      @ \<open>{eng_example ttu28 }\<close>
      @ \<open>{tech_example tt10 }\<close>
\<close>

end