This commit is contained in:
commit 1aeb275ea6

51 CHANGELOG.md
@@ -1,51 +0,0 @@
# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## [Unreleased]

### Added

### Changed

- Updated Isabelle version to Isabelle 2023

## [1.3.0] - 2022-07-08

### Changed

- The project-specific configuration is now part of the `ROOT` file; the formerly
  used `isadof.cfg` is obsolete and no longer supported.
- Removed explicit use of `document/build` script. Requires removing the `build` script
  entry from ROOT files.
- Isabelle/DOF is now a proper Isabelle component that should be installed using the
  `isabelle components` command. The installation script is now only a convenient way
  of installing the required AFP entries.
- `mkroot_DOF` has been renamed to `dof_mkroot` (and reimplemented in Scala).

## [1.2.0] - 2022-03-26

## [1.1.0] - 2021-03-20

### Added

- New antiquotations, consistency checks

### Changed

- Updated manual
- Restructured setup for ontologies (Isabelle theories and LaTeX styles)

## 1.0.0 - 2018-08-18

### Added

- First public release

[Unreleased]: https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF/compare/v1.3.0/Isabelle2021...HEAD
[1.3.0]: https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF/compare/v1.2.0/Isabelle2021...v1.3.0/Isabelle2021-1
[1.2.0]: https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF/compare/v1.1.0/Isabelle2021...v1.2.0/Isabelle2021
[1.1.0]: https://git.logicalhacking.com/Isabelle_DOF/Isabelle_DOF/compare/v1.0.0/Isabelle2019...v1.1.0/Isabelle2021
@@ -58,9 +58,6 @@
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\renewcommand{\DOFauthor}{}
\renewcommand{\DOFinstitute}{}
%\expandafter\newcommand\csname 2authand\endcsname{}
\expandafter\newcommand\csname 3authand\endcsname{}
\expandafter\newcommand\csname 4authand\endcsname{}


\begin{document}
20 Isabelle_DOF-Scaffold/Isabelle_DOF-Scaffold.thy Normal file
@@ -0,0 +1,20 @@
theory
  "Isabelle_DOF-Scaffold"
imports
  Isabelle_DOF.scholarly_paper
begin

list_templates
use_template "scrartcl"
list_ontologies
use_ontology "scholarly_paper"

title* [tit::title]\<open>Formal Verification of Security Protocols\<close>
author*[alice, email = "\<open>alice@example.com\<close>",
        http_site = "\<open>https://example.com/alice\<close>",
        affiliation = "\<open>Wonderland University\<close>"]\<open>Alice\<close>
author*[bob, email = "\<open>bob@example.com\<close>",
        http_site = "\<open>https://example.com/bob\<close>",
        affiliation = "\<open>Wonderland University\<close>"]\<open>Bob\<close>

end
9 Isabelle_DOF-Scaffold/ROOT Normal file
@@ -0,0 +1,9 @@
session "Isabelle_DOF-Scaffold" = "Isabelle_DOF" +
  options [document = pdf, document_output = "output", document_build = dof]
  (*theories [document = false]
    A
    B*)
  theories
    "Isabelle_DOF-Scaffold"
  document_files
    "preamble.tex"
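As a sketch of how such a session specification is used (assuming the scaffold directory containing this `ROOT` file is the current working directory), the session would typically be built with:

```console
foo@bar:~$ isabelle build -d . "Isabelle_DOF-Scaffold"
```

The generated PDF then ends up under the directory named by the `document_output` option.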
1 Isabelle_DOF-Scaffold/document/preamble.tex Normal file
@@ -0,0 +1 @@
%% This is a placeholder for user-specific configuration and packages.
@@ -41,10 +41,12 @@ required by Isabelle/DOF. Note that this script will only work if the AFP is
not registered as an Isabelle component. It can be executed as follows:

```console
foo@bar:~$ isabelle env install-afp
foo@bar:~$ isabelle components -u .
foo@bar:~$ isabelle env ./install-afp
```

Note that this option is not supported for the development version of Isabelle.
If the last step crashes, it may help to add 'AFP' to the toplevel ROOTS file.
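The last remedy mentioned above can be sketched as follows (assuming the checkout's toplevel `ROOTS` file is the one listing the session roots):

```console
foo@bar:~$ echo 'AFP' >> ROOTS
```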
## Installation

@@ -66,6 +68,14 @@ foo@bar:~$ isabelle build -D . -x Isabelle_DOF-Proofs -x HOL-Proofs

This will compile Isabelle/DOF and run the example suite.

For building the session ``Isabelle_DOF-Proofs``, the timeout might need to be
increased to avoid timeouts during building the dependencies:

```console
foo@bar:~$ isabelle build -d . -o 'timeout_scale=2' Isabelle_DOF-Proofs
```

## Usage

Overall, the use of the development version follows the description available
1 ROOTS
@@ -6,3 +6,4 @@ Isabelle_DOF-Example-I
Isabelle_DOF-Example-II
Isabelle_DOF-Examples-Extra
Isabelle_DOF-Examples-Templates
Isabelle_DOF-Scaffold

187 install-afp
@@ -1,187 +0,0 @@
#!/usr/bin/env bash
# Copyright (c) 2018-2019 The University of Sheffield.
#               2019-2020 The University of Exeter.
#               2018-2020 The University of Paris-Saclay.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions
# are met:
# 1. Redistributions of source code must retain the above copyright
#    notice, this list of conditions and the following disclaimer.
# 2. Redistributions in binary form must reproduce the above copyright
#    notice, this list of conditions and the following disclaimer in
#    the documentation and/or other materials provided with the
#    distribution.
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
# FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
# COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
# ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
# SPDX-License-Identifier: BSD-2-Clause

#set -e
shopt -s nocasematch


print_help()
{
    echo "Usage: isabelle env ./install-afp [OPTION] "
    echo ""
    echo "Warning: This tool is deprecated."
    echo ""
    echo "Run ..."
    echo ""
    echo "  --help, -h         display this help message"
}


exit_error() {
    echo ""
    echo "  *** Local AFP installation FAILED, please check the README.md for help ***"
    echo ""
    exit 1
}

confirm_usage() {
    echo "* From Isabelle2021-1 on, the recommended method for making the whole AFP "
    echo "  available to Isabelle is the isabelle components -u command."
    echo "  For doing so, please follow the instructions at: "
    echo "      https://www.isa-afp.org/help/"
    echo ""
    echo "  Alternatively, you can continue, at your own risk, to install only"
    echo "  the AFP entries required to run Isabelle/DOF."
    echo ""
    read -p "  Still continue (y/N)? " -n 1 -r
    echo
    if [[ $REPLY =~ ^[Yy]$ ]];
    then
        echo "    Continuing installation at your OWN risk."
    else
        exit_error
    fi
}

check_isabelle_version() {
    echo "* Checking Isabelle version:"
    if [ "Isabelle" == "$ACTUAL_ISABELLE_VERSION" ]; then
        echo "  ERROR:"
        echo "    This script does not support the development version of Isabelle."
        echo "    The recommended way of installing the AFP for the development"
        echo "    version of Isabelle is to clone the AFP repository:"
        echo "        https://foss.heptapod.net/isa-afp/afp-devel/"
        exit_error
    fi
    if [ "$ISABELLE_VERSION" != "$ACTUAL_ISABELLE_VERSION" ]; then
        echo "  WARNING:"
        echo "    The version of Isabelle (i.e., $ACTUAL_ISABELLE_VERSION) you are using"
        echo "                   IS NOT SUPPORTED"
        echo "    by the current version of Isabelle/DOF. Please install a supported"
        echo "    version of Isabelle and rerun the install script, providing the"
        echo "    \"isabelle\" command as argument."
        echo "    Isabelle ($ISABELLE_VERSION) can be obtained from:"
        echo "        https://isabelle.in.tum.de/website-$ISABELLE_VERSION/"
        echo ""
        read -p "  Still continue (y/N)? " -n 1 -r
        echo
        if [[ $REPLY =~ ^[Yy]$ ]];
        then
            echo "    Continuing installation at your OWN risk."
        else
            exit_error
        fi
    else
        echo "  Success: found supported Isabelle version ($ISABELLE_VERSION)"
    fi
}

check_afp_entries() {
    echo "* Checking availability of AFP entries:"
    missing=""
    required="Regular-Sets Functional-Automata Physical_Quantities Metalogic_ProofChecker"
    for afp in $required; do
        res=`$ISABELLE_TOOL build -n $afp 2>/dev/null || true`
        if [ "$res" != "" ]; then
            echo "  Success: found AFP entry $afp."
        else
            echo "  Warning: could not find AFP entry $afp."
            missing="$missing $afp"
        fi
    done
    if [ "$missing" != "" ]; then
        echo "  Trying to install AFP (this might take a few *minutes*) ...."
        extract=""
        for e in $missing; do
            extract="$extract $AFP_DATE/thys/$e"
        done
        mkdir -p .afp
        if curl -s -L $AFP_URL | tar zxf - -C .afp $extract; then
            for e in $missing; do
                echo "    Registering $e"
                $ISABELLE_TOOL components -u "$PWD/.afp/$AFP_DATE/thys/$e"
            done
            echo "  AFP installation successful."
        else
            echo "  FAILURE: could not find AFP entries: $missing."
            echo "  Please obtain the AFP from"
            echo "    $AFP_URL"
            echo "  and follow the instructions at:"
            echo "    https://www.isa-afp.org/using.html"
            exit_error
        fi
    fi
}

while [ $# -gt 0 ]
do
    case "$1" in
        --help|-h)
            print_help
            exit 0;;
        *)  print_help
            exit 1;;
    esac
    shift
done

if [ -z ${ISABELLE_TOOL+x} ];
then
    print_help
    exit 1
fi

ACTUAL_ISABELLE_VERSION=`$ISABELLE_TOOL version`

ISABELLE_VERSION="Isabelle$($ISABELLE_TOOL dof_param -b isabelle_version)"

if [ ${ISABELLE_VERSION} = "Isabelle" ];
then
    echo "Error: cannot find Isabelle/DOF configuration, please check that you"
    echo "       registered Isabelle/DOF as an Isabelle component, e.g., using"
    echo "           isabelle components -u ."
    exit 1
fi


AFP_DATE="$($ISABELLE_TOOL dof_param -b afp_version)"
AFP_URL="https://www.isa-afp.org/release/"$AFP_DATE".tar.gz"

echo ""
echo "Isabelle/DOF AFP Installation Utility"
echo "====================================="
confirm_usage
check_isabelle_version
check_afp_entries
echo "* AFP Installation successful."
echo "  You should now be able to enjoy Isabelle/DOF by building its sessions"
echo "  and all example documents by executing:"
echo "      $ISABELLE_TOOL build -D ."
exit 0
@@ -24,10 +24,6 @@
\IfFileExists{hvlogos.sty}{\usepackage{hvlogos}}{\newcommand{\TeXLive}{\TeX Live}\newcommand{\BibTeX}{Bib\TeX}}
\usepackage{railsetup}
\setcounter{secnumdepth}{2}
\usepackage{index}
\newcommand{\bindex}[1]{\index{#1|textbf}}
%\makeindex
%\AtEndDocument{\printindex}

\newcommand{\dof}{DOF\xspace}
\newcommand{\isactrlemph}{*}
@@ -112,9 +108,10 @@ France, and therefore granted with public funds of the Program ``Investissements
\end{minipage}
}
% Index setup
\usepackage{index}
\makeindex
\AtEndDocument{\printindex}
%\usepackage{index}
\newcommand{\bindex}[1]{\index{#1|textbf}}
%\makeindex
%\AtEndDocument{\printindex}

\newcommand{\DOFindex}[2]{%
  \marginnote{\normalfont\textbf{#1}: #2}%
@@ -122,3 +119,5 @@ France, and therefore granted with public funds of the Program ``Investissements
}%

\AtBeginDocument{\isabellestyle{literal}\newcommand{\lstnumberautorefname}{Line}}
\endinput
%:%file=$AFP/Isabelle_DOF/document/preamble.tex%:%
@@ -283,7 +283,7 @@ doc_class "corollary" = math_content +
      invariant d :: "mcc \<sigma> = corr"


text\<open>A premise or premiss[a] is a proposition — a true or false declarative statement—
text\<open>A premise or premiss[a] is a proposition --- a true or false declarative statement ---
used in an argument to prove the truth of another proposition called the conclusion.\<close>
doc_class "premise" = math_content +
   referentiable :: bool <= "True"
@@ -667,9 +667,9 @@ section\<open>Miscelleous\<close>

subsection\<open>Common Abbreviations\<close>

define_shortcut* eg  \<rightleftharpoons> \<open>\eg\<close>  (* Latin: „exempli gratia“ meaning „for example“. *)
                 ie  \<rightleftharpoons> \<open>\ie\<close>  (* Latin: „id est“ meaning „that is to say“. *)
                 etc \<rightleftharpoons> \<open>\etc\<close> (* Latin : „et cetera“ meaning „et cetera“ *)
define_shortcut* eg  \<rightleftharpoons> \<open>\eg\<close>  (* Latin: "exempli gratia" meaning "for example". *)
                 ie  \<rightleftharpoons> \<open>\ie\<close>  (* Latin: "id est" meaning "that is to say". *)
                 etc \<rightleftharpoons> \<open>\etc\<close> (* Latin : "et cetera" meaning "et cetera" *)

print_doc_classes
@@ -85,10 +85,10 @@ doc_class report =
section\<open>Experimental\<close>

(* switch on regexp syntax *)
notation Star  ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
notation Plus  (infixr "||" 55)
notation Times (infixr "~~" 60)
notation Atom  ("\<lfloor>_\<rfloor>" 65)
notation Star  (\<open>\<lbrace>(_)\<rbrace>\<^sup>*\<close> [0]100)
notation Plus  (infixr \<open>||\<close> 55)
notation Times (infixr \<open>~~\<close> 60)
notation Atom  (\<open>\<lfloor>_\<rfloor>\<close> 65)
@@ -50,8 +50,8 @@ object DOF {
  val session = "Isabelle_DOF"
  val session_ontologies = "Isabelle_DOF-Ontologies"

  val latest_version = "1.3.0"
  val latest_isabelle = "Isabelle2021-1"
  val latest_version = "2025"
  val latest_isabelle = "Isabelle2025"
  val latest_doi = "10.5281/zenodo.6810799"
  val generic_doi = "10.5281/zenodo.3370482"
@@ -410,7 +410,7 @@ fun convert_meta_args ctxt (X, (((str,_),value) :: R)) =
    let fun conv_int x = snd(HOLogic.dest_number(Syntax.read_term ctxt x))
                         handle TERM _ => error "Illegal int format."
    in
    (case YXML.content_of str of
    (case Protocol_Message.clean_output str of
        "relative_width" => upd_relative_width (K (conv_int value))
                            o convert_meta_args ctxt (X, R)
      | "relative_height" => upd_relative_height (K (conv_int value))
@@ -421,7 +421,7 @@ fun convert_meta_args ctxt (X, (((str,_),value) :: R)) =
  | convert_meta_args _ (_,[]) = I

fun convert_src_from_margs ctxt (X, (((str,_),value)::R)) =
    (case YXML.content_of str of
    (case Protocol_Message.clean_output str of
        "file_src" => Input.string (HOLogic.dest_string (Syntax.read_term ctxt value))
      | _ => convert_src_from_margs ctxt (X,R))
  | convert_src_from_margs _ (_, []) = error("No file_src provided.")
@@ -851,12 +851,12 @@ val _ = block_antiquotation \<^binding>\<open>block\<close> (block_modes -- Scan
        |> Theory.setup

fun convert_meta_args ctxt (X, (((str,_),value) :: R)) =
    (case YXML.content_of str of
        "frametitle" => upd_frametitle (K(YXML.content_of value |> read_string))
    (case Protocol_Message.clean_output str of
        "frametitle" => upd_frametitle (K(Protocol_Message.clean_output value |> read_string))
                        o convert_meta_args ctxt (X, R)
      | "framesubtitle" => upd_framesubtitle (K(YXML.content_of value |> read_string))
      | "framesubtitle" => upd_framesubtitle (K(Protocol_Message.clean_output value |> read_string))
                           o convert_meta_args ctxt (X, R)
      | "options" => upd_options (K(YXML.content_of value |> read_string))
      | "options" => upd_options (K(Protocol_Message.clean_output value |> read_string))
                     o convert_meta_args ctxt (X, R)
      | s => error("!undefined attribute:"^s))
  | convert_meta_args _ (_,[]) = I
@@ -124,7 +124,7 @@ struct
    Name_Space.check (Context.Theory thy)
      (get_onto_classes (Proof_Context.init_global thy)) (name, Position.none) |> #2

fun markup2string s = YXML.content_of s
fun markup2string s = Protocol_Message.clean_output s
                      |> Symbol.explode
                      |> List.filter (fn c => c <> Symbol.DEL)
                      |> String.concat
@@ -876,6 +876,14 @@ val long_doc_class_prefix = ISA_prefix ^ "long_doc_class_"

fun is_ISA s = String.isPrefix ISA_prefix (Long_Name.base_name s)

val strip_positions_is_ISA =
  let
    fun strip ((t as Const ("_type_constraint_", \<^Type>\<open>fun A _\<close>)) $ (u as Const (s,_))) =
          if Term_Position.detect_positionT A andalso is_ISA s then u else t $ u
      | strip (t $ u) = strip t $ strip u
      | strip (Abs (a, T, b)) = Abs (a, T, strip b)
      | strip t = t;
  in strip end;

fun transduce_term_global {mk_elaboration=mk_elaboration} (term,pos) thy =
    (* pre: term should be fully typed in order to allow type-related term-transformations *)
@@ -908,24 +916,27 @@ fun transduce_term_global {mk_elaboration=mk_elaboration} (term,pos) thy =
                    else Const(s, ty)
          | T(Abs(s,ty,t)) = Abs(s,ty,T t)
          | T t = t
    in T term end
    in T (strip_positions_is_ISA term) end

(* Elaborate an Isabelle_DOF term-antiquotation from an only parsed-term (not checked) *)
fun parsed_elaborate ctxt pos (Const(s,ty) $ t) =
      if is_ISA s
      then Syntax.check_term ctxt (Const(s, ty) $ t)
           |> (fn t => transduce_term_global {mk_elaboration=true} (t , pos)
                                             (Proof_Context.theory_of ctxt))
      else (Const(s,ty) $ (parsed_elaborate ctxt pos t))
  | parsed_elaborate ctxt pos (t1 $ t2) = parsed_elaborate ctxt pos (t1) $ parsed_elaborate ctxt pos (t2)
  | parsed_elaborate ctxt pos (Const(s,ty)) =
      if is_ISA s
      then Syntax.check_term ctxt (Const(s, ty))
           |> (fn t => transduce_term_global {mk_elaboration=true} (t , pos)
                                             (Proof_Context.theory_of ctxt))
      else Const(s,ty)
  | parsed_elaborate ctxt pos (Abs(s,ty,t)) = Abs (s,ty,parsed_elaborate ctxt pos t)
  | parsed_elaborate _ _ t = t
fun parsed_elaborate ctxt pos =
  let
    fun elaborate (Const(s,ty) $ t) =
          if is_ISA s
          then Syntax.check_term ctxt (Const(s, ty) $ t)
               |> (fn t => transduce_term_global {mk_elaboration=true} (t , pos)
                                                 (Proof_Context.theory_of ctxt))
          else (Const(s,ty) $ (elaborate t))
      | elaborate (t1 $ t2) = elaborate (t1) $ elaborate (t2)
      | elaborate (Const(s,ty)) =
          if is_ISA s
          then Syntax.check_term ctxt (Const(s, ty))
               |> (fn t => transduce_term_global {mk_elaboration=true} (t , pos)
                                                 (Proof_Context.theory_of ctxt))
          else Const(s,ty)
      | elaborate (Abs(s,ty,t)) = Abs (s,ty,elaborate t)
      | elaborate t = t
  in elaborate o strip_positions_is_ISA end

fun elaborate_term' ctxt parsed_term = parsed_elaborate ctxt Position.none parsed_term

@@ -938,23 +949,26 @@ fun check_term ctxt term = transduce_term_global {mk_elaboration=false}
                                                 (Proof_Context.theory_of ctxt)

(* Check an Isabelle_DOF term-antiquotation from an only parsed-term (not checked) *)
fun parsed_check ctxt pos (Const(s,ty) $ t) =
      if is_ISA s
      then let val _ = Syntax.check_term ctxt (Const(s, ty) $ t)
                       |> (fn t => transduce_term_global {mk_elaboration=false} (t , pos)
                                                         (Proof_Context.theory_of ctxt))
           in (Const(s,ty) $ (parsed_check ctxt pos t)) end
      else (Const(s,ty) $ (parsed_check ctxt pos t))
  | parsed_check ctxt pos (t1 $ t2) = parsed_check ctxt pos (t1) $ parsed_check ctxt pos (t2)
  | parsed_check ctxt pos (Const(s,ty)) =
      if is_ISA s
      then let val _ = Syntax.check_term ctxt (Const(s, ty))
                       |> (fn t => transduce_term_global {mk_elaboration=false} (t , pos)
                                                         (Proof_Context.theory_of ctxt))
           in Const(s,ty) end
      else Const(s,ty)
  | parsed_check ctxt pos (Abs(s,ty,t)) = Abs (s,ty,parsed_check ctxt pos t)
  | parsed_check _ _ t = t
fun parsed_check ctxt pos =
  let
    fun check (Const(s,ty) $ t) =
          if is_ISA s
          then let val _ = Syntax.check_term ctxt (Const(s, ty) $ t)
                           |> (fn t => transduce_term_global {mk_elaboration=false} (t , pos)
                                                             (Proof_Context.theory_of ctxt))
               in (Const(s,ty) $ (check t)) end
          else (Const(s,ty) $ (check t))
      | check (t1 $ t2) = check (t1) $ check (t2)
      | check (Const(s,ty)) =
          if is_ISA s
          then let val _ = Syntax.check_term ctxt (Const(s, ty))
                           |> (fn t => transduce_term_global {mk_elaboration=false} (t , pos)
                                                             (Proof_Context.theory_of ctxt))
               in Const(s,ty) end
          else Const(s,ty)
      | check (Abs(s,ty,t)) = Abs (s,ty,check t)
      | check t = t
  in check o strip_positions_is_ISA end

fun check_term' ctxt parsed_term = parsed_check ctxt Position.none parsed_term

@@ -1050,15 +1064,15 @@ fun trace_attr_t cid oid =

\<comment> \<open>and others in the future : file, http, thy, ...\<close>

datatype "typ" = Isabelle_DOF_typ string ("@{typ _}")
datatype "term" = Isabelle_DOF_term string ("@{term _}")
datatype "thm" = Isabelle_DOF_thm string ("@{thm _}")
datatype "file" = Isabelle_DOF_file string ("@{file _}")
datatype "thy" = Isabelle_DOF_thy string ("@{thy _}")
consts Isabelle_DOF_docitem :: "string \<Rightarrow> 'a" ("@{docitem _}")
datatype "docitem_attr" = Isabelle_DOF_docitem_attr string string ("@{docitemattr (_) :: (_)}")
consts Isabelle_DOF_trace_attribute :: "string \<Rightarrow> (string * string) list" ("@{trace'_attribute _}")
consts Isabelle_DOF_instances_of :: "string \<Rightarrow> 'a list" ("@{instances'_of _}")
datatype "typ" = Isabelle_DOF_typ string (\<open>@{typ _}\<close>)
datatype "term" = Isabelle_DOF_term string (\<open>@{term _}\<close>)
datatype "thm" = Isabelle_DOF_thm string (\<open>@{thm _}\<close>)
datatype "file" = Isabelle_DOF_file string (\<open>@{file _}\<close>)
datatype "thy" = Isabelle_DOF_thy string (\<open>@{thy _}\<close>)
consts Isabelle_DOF_docitem :: "string \<Rightarrow> 'a" (\<open>@{docitem _}\<close>)
datatype "docitem_attr" = Isabelle_DOF_docitem_attr string string (\<open>@{docitemattr (_) :: (_)}\<close>)
consts Isabelle_DOF_trace_attribute :: "string \<Rightarrow> (string * string) list" (\<open>@{trace'_attribute _}\<close>)
consts Isabelle_DOF_instances_of :: "string \<Rightarrow> 'a list" (\<open>@{instances'_of _}\<close>)

\<comment> \<open>Dynamic setup of inner syntax cartouche\<close>

@@ -1082,15 +1096,15 @@ ML \<open>
    let fun err () = raise TERM ("string_tr", args) in
      (case args of
        [(c as Const (@{syntax_const "_constrain"}, _)) $ Free (s, _) $ p] =>
          (case Term_Position.decode_position p of
            SOME (pos, _) => c $ f (mk_string f_mk accu (content (s, pos))) $ p
          (case Term_Position.decode_position1 p of
            SOME {pos, ...} => c $ f (mk_string f_mk accu (content (s, pos))) $ p
          | NONE => err ())
      | _ => err ())
    end;
  end;
\<close>

syntax "_cartouche_string" :: "cartouche_position \<Rightarrow> _" ("_")
syntax "_cartouche_string" :: "cartouche_position \<Rightarrow> _" (\<open>_\<close>)

ML\<open>
structure Cartouche_Grammar = struct
@@ -1285,15 +1299,14 @@ fun declare_ISA_class_accessor_and_check_instance (params, doc_class_name, bind_
    fun mixfix_enclose name = name |> enclose "@{" " _}"
    val mixfix = clean_mixfix bname |> mixfix_enclose
    val mixfix' = clean_mixfix doc_class_name |> mixfix_enclose
    fun add_const (b, T, mx) =
      Sign.add_consts [(b, T, mx)] #>
      DOF_core.add_isa_transformer b
        ((check_instance, elaborate_instance) |> DOF_core.make_isa_transformer)
  in
    thy |> rm_mixfix bname' mixfix
        |> Sign.add_consts [(bind, const_typ, Mixfix.mixfix mixfix)]
        |> DOF_core.add_isa_transformer bind ((check_instance, elaborate_instance)
                                              |> DOF_core.make_isa_transformer)
        |> Sign.add_consts [(bind', const_typ, Mixfix.mixfix mixfix')]
        |> DOF_core.add_isa_transformer bind' ((check_instance, elaborate_instance)
                                               |> DOF_core.make_isa_transformer)

        |> add_const (bind, const_typ, Mixfix.mixfix mixfix)
        |> add_const (bind', const_typ, Mixfix.mixfix mixfix')
  end

fun elaborate_instances_of thy _ _ term_option _ =
@@ -1900,7 +1913,7 @@ fun create_and_check_docitem is_monitor {is_inline=is_inline} {define=define} bi
       without using the burden of ontology classes.
       ex: text*[sdf]\<open> Lorem ipsum @{thm refl}\<close> *)
    else let
      fun conv_attrs ((lhs, pos), rhs) = (YXML.content_of lhs,pos,"=", Syntax.parse_term (Proof_Context.init_global thy) rhs)
      fun conv_attrs ((lhs, pos), rhs) = (Protocol_Message.clean_output lhs,pos,"=", Syntax.parse_term (Proof_Context.init_global thy) rhs)
      val assns' = map conv_attrs doc_attrs
      val defaults_init = create_default_object thy binding cid_long typ
      fun conv (na, _(*ty*), parsed_term) = (Binding.name_of na, Binding.pos_of na, "=", parsed_term);
@@ -2116,7 +2129,7 @@ fun update_instance_command ((binding, cid_pos),
    val _ = if cid' = DOF_core.default_cid orelse cid = cid'
            then ()
            else error("incompatible classes:"^cid^":"^cid')
    fun conv_attrs (((lhs, pos), opn), rhs) = ((YXML.content_of lhs),pos,opn,
    fun conv_attrs (((lhs, pos), opn), rhs) = ((Protocol_Message.clean_output lhs),pos,opn,
                                               Syntax.parse_term (Proof_Context.init_global thy) rhs)
    val assns' = map conv_attrs doc_attrs
    val def_trans_value =
@@ -2258,9 +2271,9 @@ fun meta_args_2_latex thy sem_attrs transform_attr


    fun markup2string s = String.concat (List.filter (fn c => c <> Symbol.DEL)
                                                     (Symbol.explode (YXML.content_of s)))
                                                     (Symbol.explode (Protocol_Message.clean_output s)))
    fun ltx_of_markup ctxt s = let
        val term = (Syntax.check_term ctxt o Syntax.parse_term ctxt) s
        val term = (Syntax.check_term ctxt o Syntax.parse_term ctxt) s
        val str_of_term = ltx_of_term ctxt true term
        (* handle _ => "Exception in ltx_of_term" *)
      in
@@ -2270,7 +2283,7 @@ fun meta_args_2_latex thy sem_attrs transform_attr
    val ctxt = Proof_Context.init_global thy
    val actual_args = map (fn ((lhs,_),rhs) => (toLong lhs, ltx_of_markup ctxt rhs))
                          attr_list
    val default_args =
    val default_args =
      (DOF_core.get_attribute_defaults cid_long thy)
      |> map (fn (b,_, parsed_term) =>
                (toLong (Long_Name.base_name ( Sign.full_name thy b))
@@ -2560,13 +2573,13 @@ fun get_positions ctxt x =
    fun get Cs (Const ("_type_constraint_", C) $ t) = get (C :: Cs) t
      | get Cs (Free (y, T)) =
          if x = y then
            map_filter Term_Position.decode_positionT
            maps Term_Position.decode_positionT
              (T :: map (Type.constraint_type ctxt) Cs)
          else []
      | get _ (t $ u) = get [] t @ get [] u
      | get _ (Abs (_, _, t)) = get [] t
      | get _ _ = [];
  in get [] end;
  in map #pos o get [] end;

fun dummy_frees ctxt xs tss =
  let
@@ -3117,7 +3130,7 @@ fun add_doc_class_cmd overloaded (raw_params, binding)
    val params = map (apsnd (Typedecl.read_constraint ctxt)) raw_params;
    val ctxt1 = fold (Variable.declare_typ o TFree) params ctxt;
    fun markup2string s = String.concat (List.filter (fn c => c <> Symbol.DEL)
                                                     (Symbol.explode (YXML.content_of s)))
                                                     (Symbol.explode (Protocol_Message.clean_output s)))
    val name' =
      case raw_parent of
        NONE => DOF_core.default_cid
@@ -3230,8 +3243,8 @@ fun add_onto_morphism classes_mappings eqs thy =
    val converts =
      map (fn (oclasses, dclass) =>
        let
          val oclasses_string = map YXML.content_of oclasses
          val dclass_string = YXML.content_of dclass
          val oclasses_string = map Protocol_Message.clean_output oclasses
          val dclass_string = Protocol_Message.clean_output dclass
          val const_sub_name = dclass_string
            |> (oclasses_string |> fold_rev (fn x => fn y => x ^ "_" ^ y))
            |> String.explode |> map (fn x => "\<^sub>" ^ (String.str x)) |> String.concat

@@ -41,15 +41,15 @@ text\<open> The implementation of the monitoring concept follows the following d

section\<open>Monitor Syntax over RegExp - constructs\<close>

notation Star  ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
notation Plus  (infixr "||" 55)
notation Times (infixr "~~" 60)
notation Atom  ("\<lfloor>_\<rfloor>" 65)
notation Star  (\<open>\<lbrace>(_)\<rbrace>\<^sup>*\<close> [0]100)
notation Plus  (infixr \<open>||\<close> 55)
notation Times (infixr \<open>~~\<close> 60)
notation Atom  (\<open>\<lfloor>_\<rfloor>\<close> 65)

definition rep1 :: "'a rexp \<Rightarrow> 'a rexp" ("\<lbrace>(_)\<rbrace>\<^sup>+")
definition rep1 :: "'a rexp \<Rightarrow> 'a rexp" (\<open>\<lbrace>(_)\<rbrace>\<^sup>+\<close>)
  where "\<lbrace>A\<rbrace>\<^sup>+ \<equiv> A ~~ \<lbrace>A\<rbrace>\<^sup>*"

definition opt :: "'a rexp \<Rightarrow> 'a rexp" ("\<lbrakk>(_)\<rbrakk>")
definition opt :: "'a rexp \<Rightarrow> 'a rexp" (\<open>\<lbrakk>(_)\<rbrakk>\<close>)
  where "\<lbrakk>A\<rbrakk> \<equiv> A || One"

value "Star (Conc(Alt (Atom(CHR ''a'')) (Atom(CHR ''b''))) (Atom(CHR ''c'')))"
@@ -302,12 +302,12 @@ lemma mono_rep1_star':"L\<^sub>s\<^sub>u\<^sub>b (a) \<subseteq> L\<^sub>s\<^sub
  using mono_Star' rep1_star_incl' by blast


no_notation Star  ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
no_notation Plus  (infixr "||" 55)
no_notation Times (infixr "~~" 60)
no_notation Atom  ("\<lfloor>_\<rfloor>" 65)
no_notation rep1  ("\<lbrace>(_)\<rbrace>\<^sup>+")
no_notation opt   ("\<lbrakk>(_)\<rbrakk>")
no_notation Star  (\<open>\<lbrace>(_)\<rbrace>\<^sup>*\<close> [0]100)
no_notation Plus  (infixr \<open>||\<close> 55)
no_notation Times (infixr \<open>~~\<close> 60)
no_notation Atom  (\<open>\<lfloor>_\<rfloor>\<close> 65)
no_notation rep1  (\<open>\<lbrace>(_)\<rbrace>\<^sup>+\<close>)
no_notation opt   (\<open>\<lbrakk>(_)\<rbrakk>\<close>)

ML\<open>
structure RegExpInterface_Notations =
@@ -23,7 +23,7 @@ begin


 (*<*)
-definition combinator1 :: "'a \<Rightarrow> ('a \<Rightarrow> 'b) \<Rightarrow> 'b" (infixl "|>" 65)
+definition combinator1 :: "'a \<Rightarrow> ('a \<Rightarrow> 'b) \<Rightarrow> 'b" (infixl \<open>|>\<close> 65)
 where "x |> f = f x"

@@ -190,7 +190,7 @@ onto_morphism (acm) to elsevier
 text\<open>Here is a more basic, but equivalent definition for the other way round:\<close>

 definition elsevier_to_acm_morphism :: "elsevier_org \<Rightarrow> acm_org"
-("_ \<langle>acm\<rangle>\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r" [1000]999)
+(\<open>_ \<langle>acm\<rangle>\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r\<close> [1000]999)
 where "\<sigma> \<langle>acm\<rangle>\<^sub>e\<^sub>l\<^sub>s\<^sub>e\<^sub>v\<^sub>i\<^sub>e\<^sub>r = \<lparr> acm_org.tag_attribute = 1::int,
 acm_org.position = ''no position'',
 acm_org.institution = organization \<sigma>,
@@ -318,19 +318,19 @@ the presentation of them) is very different. Besides, morphisms functions can be
 (\<^ie> surjective), ``embedding'' (\<^ie> injective) or even ``one-to-one'' ((\<^ie> bijective).\<close>

 definition Item_to_Resource_morphism :: "Item \<Rightarrow> Resource"
-("_ \<langle>Resource\<rangle>\<^sub>I\<^sub>t\<^sub>e\<^sub>m" [1000]999)
+(\<open>_ \<langle>Resource\<rangle>\<^sub>I\<^sub>t\<^sub>e\<^sub>m\<close> [1000]999)
 where " \<sigma> \<langle>Resource\<rangle>\<^sub>I\<^sub>t\<^sub>e\<^sub>m =
 \<lparr> Resource.tag_attribute = 1::int ,
 Resource.name = name \<sigma> \<rparr>"

 definition Product_to_Resource_morphism :: "Product \<Rightarrow> Resource"
-("_ \<langle>Resource\<rangle>\<^sub>P\<^sub>r\<^sub>o\<^sub>d\<^sub>u\<^sub>c\<^sub>t" [1000]999)
+(\<open>_ \<langle>Resource\<rangle>\<^sub>P\<^sub>r\<^sub>o\<^sub>d\<^sub>u\<^sub>c\<^sub>t\<close> [1000]999)
 where " \<sigma> \<langle>Resource\<rangle>\<^sub>P\<^sub>r\<^sub>o\<^sub>d\<^sub>u\<^sub>c\<^sub>t =
 \<lparr> Resource.tag_attribute = 2::int ,
 Resource.name = name \<sigma> \<rparr>"

 definition Computer_Software_to_Software_morphism :: "Computer_Software \<Rightarrow> Software"
-("_ \<langle>Software\<rangle>\<^sub>S\<^sub>o\<^sub>f\<^sub>t\<^sub>C\<^sub>m\<^sub>p" [1000]999)
+(\<open>_ \<langle>Software\<rangle>\<^sub>S\<^sub>o\<^sub>f\<^sub>t\<^sub>C\<^sub>m\<^sub>p\<close> [1000]999)
 where "\<sigma> \<langle>Software\<rangle>\<^sub>S\<^sub>o\<^sub>f\<^sub>t\<^sub>C\<^sub>m\<^sub>p =
 \<lparr> Resource.tag_attribute = 3::int ,
 Resource.name = name \<sigma> ,
@@ -339,7 +339,7 @@ definition Computer_Software_to_Software_morphism :: "Computer_Software \<Righta
 Software.version = version \<sigma> \<rparr>"

 definition Electronic_Component_to_Component_morphism :: "Electronic_Component \<Rightarrow> Component"
-("_ \<langle>Component\<rangle>\<^sub>E\<^sub>l\<^sub>e\<^sub>c\<^sub>C\<^sub>m\<^sub>p" [1000]999)
+(\<open>_ \<langle>Component\<rangle>\<^sub>E\<^sub>l\<^sub>e\<^sub>c\<^sub>C\<^sub>m\<^sub>p\<close> [1000]999)
 where "\<sigma> \<langle>Component\<rangle>\<^sub>E\<^sub>l\<^sub>e\<^sub>c\<^sub>C\<^sub>m\<^sub>p =
 \<lparr> Resource.tag_attribute = 4::int ,
 Resource.name = name \<sigma> ,
@@ -348,7 +348,7 @@ definition Electronic_Component_to_Component_morphism :: "Electronic_Component \
 Component.mass = mass \<sigma> \<rparr>"

 definition Monitor_to_Hardware_morphism :: "Monitor \<Rightarrow> Hardware"
-("_ \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e" [1000]999)
+(\<open>_ \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e\<close> [1000]999)
 where "\<sigma> \<langle>Hardware\<rangle>\<^sub>C\<^sub>o\<^sub>m\<^sub>p\<^sub>u\<^sub>t\<^sub>e\<^sub>r\<^sub>H\<^sub>a\<^sub>r\<^sub>d\<^sub>w\<^sub>a\<^sub>r\<^sub>e =
 \<lparr> Resource.tag_attribute = 5::int ,
 Resource.name = name \<sigma> ,
@@ -390,10 +390,10 @@ subsection\<open>Proving Monitor-Refinements\<close>

 (*<*)
 (* switch on regexp syntax *)
-notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
-notation Plus (infixr "||" 55)
-notation Times (infixr "~~" 60)
-notation Atom ("\<lfloor>_\<rfloor>" 65)
+notation Star (\<open>\<lbrace>(_)\<rbrace>\<^sup>*\<close> [0]100)
+notation Plus (infixr \<open>||\<close> 55)
+notation Times (infixr \<open>~~\<close> 60)
+notation Atom (\<open>\<lfloor>_\<rfloor>\<close> 65)
 (*>*)

@@ -442,10 +442,10 @@ of the above language refinement is quasi automatic. This proof is also part of
 (*<*)

 (* switch off regexp syntax *)
-no_notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
-no_notation Plus (infixr "||" 55)
-no_notation Times (infixr "~~" 60)
-no_notation Atom ("\<lfloor>_\<rfloor>" 65)
+no_notation Star (\<open>\<lbrace>(_)\<rbrace>\<^sup>*\<close> [0]100)
+no_notation Plus (infixr \<open>||\<close> 55)
+no_notation Times (infixr \<open>~~\<close> 60)
+no_notation Atom (\<open>\<lfloor>_\<rfloor>\<close> 65)

 end
 (*>*)
 (*>*)

@@ -614,7 +614,7 @@ subsubsection\<open>Examples\<close>
 text\<open>
 \<^item> common short-cut hiding \<^LaTeX> code in the integrated source:
 @{theory_text [display]
-\<open>define_shortcut* eg \<rightleftharpoons> \<open>\eg\<close> (* Latin: „exempli gratia“ meaning „for example“. *)
+\<open>define_shortcut* eg \<rightleftharpoons> \<open>\eg\<close> (* Latin: "exempli gratia" meaning "for example". *)
 clearpage \<rightleftharpoons> \<open>\clearpage{}\<close>\<close>}
 \<^item> non-checking macro:
 @{theory_text [display]