Merge branch 'main' into Isabelle_dev

This commit is contained in:
Achim D. Brucker 2022-08-01 22:58:21 +01:00
commit 9f2e2b53a4
5 changed files with 574 additions and 424 deletions

View File

@ -19,6 +19,8 @@ theory
"Isabelle_DOF.technical_report"
begin
declare[[strict_monitor_checking=true]]
define_shortcut* dof \<rightleftharpoons> \<open>\dof\<close>
isadof \<rightleftharpoons> \<open>\isadof{}\<close>
(*>*)
title*[title::title]\<open>The CENELEC 50128 Ontology\<close>
@ -27,14 +29,34 @@ subtitle*[subtitle::subtitle]\<open>Case Study: An Odometer-Subsystem\<close>
chapter*[casestudy::technical]\<open>An Odometer-Subsystem\<close>
text\<open>
In our case study, we will follow the phases of analysis, design, and implementation of the
odometry function of a train. This software processes data from an odometer to compute the position,
speed, and acceleration of a train. This system provides the basis for many
safety critical decisions, \eg, the opening of the doors. Due to its relatively small size, it
odometry function of a train. This \<^cenelec_term>\<open>SF\<close> processes data from an odometer to compute
the position, speed, and acceleration of a train. This system provides the basis for many
safety critical decisions, \<^eg>, the opening of the doors. Due to its relatively small size, it
is a manageable, albeit realistic target for a comprehensive formal development: it covers a
physical model of the environment, the physical and architectural model of the odometer including
the problem of numerical sampling, and the boundaries of efficient computations. The interplay
between environment and measuring-device as well as the implementation problems on a platform
with limited resources makes the odometer a fairly typical safety critical embedded system.
physical model of the environment, the physical and architectural model of the odometer,
but also the \<^cenelec_term>\<open>SFRS\<close> aspects including the problem of numerical sampling and the
boundaries of efficient computations. The interplay between environment and measuring-device as
well as the implementation problems on a platform with limited resources makes the odometer a
fairly typical \<^cenelec_term>\<open>safety\<close> critical \<^cenelec_term>\<open>component\<close> of an embedded system.
The case-study is presented in the form of an \<^emph>\<open>integrated source\<close> in \<^isadof> containing all four
reports from the phases:
\<^item> \<^term>\<open>software_requirements\<close>, \<^ie> the \<^onto_class>\<open>SWRS\<close>
(or long:\<^typ>\<open>software_requirements_specification\<close>(-report))
\<^item> \<^term>\<open>software_architecture_and_design\<close>, \<^ie> the \<^onto_class>\<open>SWDS\<close>
(or long: \<^typ>\<open>software_design_specification\<close>(-report))
\<^item> \<^term>\<open>software_architecture_and_design_verification\<close>, \<^ie> the \<^onto_class>\<open>SWADVR\<close>
(or long: \<^typ>\<open>software_architecture_and_design_verification\<close>(-report))
\<^item> \<^term>\<open>component_implementation_and_testing\<close>, \<^ie> the \<^onto_class>\<open>SWADVR\<close>
(or long: \<^typ>\<open>software_component_design_verification\<close>(-report).)
The objective of this case study is to demonstrate deep-semantical ontologies in
software developments targeting certifications, and in particular, how \<^isadof>'s
integrated source concept makes it possible to assure \<^cenelec_term>\<open>traceability\<close>.
\<^bold>\<open>NOTE\<close> that this case study has aspects that are actually covered by CENELEC 50126,
the 'systems' counterpart covering hardware aspects. Recall that CENELEC 50128 covers
software.
Due to space reasons, we will focus on the analysis part of the integrated
document; the design and code parts will only be outlined in a final resume. The
@ -45,7 +67,8 @@ text\<open>
development.
\<close>
section\<open>System Requirements as an \<^emph>\<open>Integrated Source\<close>\<close>
section\<open>A CENELEC-conform development as an \<^emph>\<open>Integrated Source\<close>\<close>
text\<open>Accurate information about a train's location along a track is an important prerequisite
for safe railway operation. Position, speed, and acceleration measurement usually relies on a
set of independent measurements based on different physical principles---as a way to enhance
@ -74,17 +97,23 @@ text\<open>
This model is already based on several fundamental assumptions relevant for the correct
functioning of the system and for its integration into the system as a whole. In
particular, we need to make the following assumptions explicit:\vspace{-.3cm}
\<^item> that the wheel is perfectly circular with a given, constant radius,
\<^item> that the slip between the trains wheel and the track negligible,
\<^item> the distance between all teeth of a wheel is the same and constant, and
\<^item> the sampling rate of positions is a given constant.
particular, we need to make the following assumptions explicit: \<^vs>\<open>-0.3cm\<close>\<close>
text*["perfect-wheel"::assumption]
\<open>\<^item> the wheel is perfectly circular with a given, constant radius. \<^vs>\<open>-0.3cm\<close>\<close>
text*["no-slip"::assumption]
\<open>\<^item> the slip between the train's wheel and the track is negligible. \<^vs>\<open>-0.3cm\<close>\<close>
text*["constant-teeth-dist"::assumption]
\<open>\<^item> the distance between all teeth of a wheel is the same and constant, and \<^vs>\<open>-0.3cm\<close>\<close>
text*["constant-sampling-rate"::assumption]
\<open>\<^item> the sampling rate of positions is a given constant.\<close>
text\<open>
These assumptions have to be traced throughout the certification process as
\<^emph>\<open>derived requirements\<close> (or, in CENELEC terminology, as \<^emph>\<open>exported constraints\<close>), which is
also reflected by their tracing throughout the body of certification documents. This may result
in operational regulations, \eg, regular checks for tolerable wheel defects. As for the
in operational regulations, \<^eg>, regular checks for tolerable wheel defects. As for the
\<^emph>\<open>no slip\<close>-assumption, this leads to the modeling of constraints under which physical
slip can be neglected: the device can only produce reliable results under certain physical
constraints (speed and acceleration limits). Moreover, the \<^emph>\<open>no slip\<close>-assumption motivates
@ -93,57 +122,53 @@ text\<open>
\<close>
subsection\<open>Capturing ``System Architecture.''\<close>
figure*["three-phase"::figure,relative_width="70",src="''figures/three-phase-odo''"]
\<open>An odometer with three sensors \<open>C1\<close>, \<open>C2\<close>, and \<open>C3\<close>.\<close>
text\<open>
\begin{figure}
\centering
\includegraphics[width=.70\textwidth]{figures/three-phase-odo}
\begin{picture}(0,0)
\put(-112,44){\includegraphics[width=.30\textwidth]{figures/odometer}}
\end{picture}
\caption{An odometer with three sensors \inlineisar{C1}, \inlineisar{C2}, and \inlineisar{C3}.}\label{three-phase}
\end{figure}
The requirements analysis also contains a sub-document \<^emph>\<open>system architecture description\<close>
(CENELEC notion) that contains technical drawing of the odometer, a timing diagrams
(see \autoref{three-phase}), and tables describing the encoding of the position
for the possible signal transitions of the sensors \inlineisar{C1}, \inlineisar{C2}, and $C3.$
The requirements analysis also contains a document \<^doc_class>\<open>SYSAD\<close>
(\<^typ>\<open>system_architecture_description\<close>) that contains a technical drawing of the odometer,
a timing diagram (see \<^figure>\<open>three-phase\<close>), and tables describing the encoding of the position
for the possible signal transitions of the sensors \<open>C1\<close>, \<open>C2\<close>, and \<open>C3\<close>.
\<close>
subsection\<open>Capturing ``System Interfaces.''\<close>
text\<open>
The requirements analysis also contains a sub-document \<^emph>\<open>functions and interfaces\<close>
(CENELEC notion) describing the technical format of the output of the odometry function.
This section, \eg, specifies the output \<^emph>\<open>speed\<close> as given by a \<^verbatim>\<open>int_32\<close> to be the
``Estimation of the speed (in mm/sec) evaluated over the latest $N_{\text{avg}}$ samples''
where the speed refers to the physical speed of the train and $N_{\text{avg}}$ a parameter of the
The requirements analysis also contains a sub-document \<^doc_class>\<open>FnI\<close> (\<^typ>\<open>functions_and_interfaces\<close>)
describing the technical format of the output of the odometry function.
This section, \<^eg>, specifies the output \<^emph>\<open>speed\<close> as given by a \<^verbatim>\<open>int_32\<close> to be the
``Estimation of the speed (in mm/sec) evaluated over the latest \<open>N\<^sub>a\<^sub>v\<^sub>g\<close> samples''
where the speed refers to the physical speed of the train and \<open>N\<^sub>a\<^sub>v\<^sub>g\<close> is a parameter of the
sub-system configuration. \<close>
(*<*)
declare_reference*["df-numerics-encshaft"::figure]
declare_reference*["df-numerics-encshaft"::figure]
(*>*)
subsection\<open>Capturing ``Required Performances.''\<close>
text\<open>
The given analysis document is relatively implicit on the expected precision of the measurements;
however, certain interface parameters like \inlineisar*Odometric_Position_TimeStamp*
(a counter on the number of samplings) and \inlineisar*Relative_Position* are defined by as
however, certain interface parameters like \<open>Odometric_Position_TimeStamp\<close>
(a counter on the number of samplings) and \<open>Relative_Position\<close> are defined as
unsigned 32-bit integers. These definitions imply exported constraints concerning the acceptable
time of service as well as the maximum distance before a necessary reboot of the subsystem.
For our case-study, we assume maximum deviation of the \inlineisar*Relative_Position* to the
For our case-study, we assume a maximum deviation of the \<open>Relative_Position\<close> from the
theoretical distance.
The requirement analysis document describes the physical environment, the architecture
of the measuring device, and the required format and precision of the measurements of the odometry
function as represented (see @{figure (unchecked) "df-numerics-encshaft"}).\<close>
figure*["df-numerics-encshaft"::figure,relative_width="76",src="''figures/df-numerics-encshaft''"]
\<open>Real distance vs. discrete distance vs. shaft-encoder sequence\<close>
subsection\<open>Capturing the ``Software Design Spec'' (Resume).\<close>
text\<open>
\enlargethispage{\baselineskip}
The design provides a function that manages an internal first-in-first-out buffer of
shaft-encodings and corresponding positions. Central to the design is a step-function analyzing
new incoming shaft encodings, checking them and propagating two kinds of error-states (one allowing
recovery, another one, fatal, signaling, \eg, a defect of the receiver hardware),
recovery, another one, fatal, signaling, \<^eg>, a defect of the receiver hardware),
calculating the relative position, speed and acceleration.
\<close>
@ -165,85 +190,94 @@ section\<open>Formal Enrichment of the Software Requirements Specification\<clos
text\<open>
After the \<^emph>\<open>capture\<close>-phase, where we converted/integrated existing informal analysis and design
documents as well as code into an integrated Isabelle document, we entered the phase of
\<open>formal enrichment\<close>. For example, from the assumptions in the architecture follow
the definitions:
\begin{isar}
definition teeth_per_wheelturn::nat ("tpw") where "tpw \<equiv> SOME x. x > 0"
definition wheel_diameter::real ("w$_d$") where "w$_d$ \<equiv> SOME x. x > 0"
definition wheel_circumference::real ("w$_{\text{circ}}$") where "w$_{\text{circ}}$ \<equiv> pi * w$_d$"
definition \<delta>s$_{\text{res}}$::real where "\<delta>s$_{\text{res}}$ \<equiv> w$_{\text{circ}}$ / (2 * 3 * tpw)"
\end{isar}
Here, \inlineisar{real} refers to the real numbers as defined in the HOL-Analysis
library, which provides concepts such as Cauchy Sequences, limits,
differentiability, and a very substantial part of classical Calculus. \inlineisar{SOME} is the
Hilbert choice operator from HOL; the definitions of the model parameters admit all possible positive values as uninterpreted
constants. Our perfect-wheel assumption is translated into a calculation of the circumference of the
wheel, while \inlineisar{\<delta>s<bsub>res<esub>}, the resolution of the odometer, can be calculated
\<open>formal enrichment\<close>. For example, from the assumptions in the architecture follow
the definitions:
@{theory_text [display]\<open>
definition teeth_per_wheelturn::nat ("tpw") where "tpw \<equiv> SOME x. x > 0"
definition wheel_diameter::real ("w\<^sub>d") where "w\<^sub>d \<equiv> SOME x. x > 0"
definition wheel_circumference::real ("w\<^sub>0") where "w\<^sub>0 \<equiv> pi * w\<^sub>d"
definition \<delta>s\<^sub>r\<^sub>e\<^sub>s::real where "\<delta>s\<^sub>r\<^sub>e\<^sub>s \<equiv> w\<^sub>0 / (2 * 3 * tpw)"
\<close>}
Here, \<open>real\<close> refers to the real numbers as defined in the HOL-Analysis library, which provides
concepts such as Cauchy Sequences, limits, differentiability, and a very substantial part of
classical Calculus. \<open>SOME\<close> is the Hilbert choice operator from HOL; the definitions of the
model parameters admit all possible positive values as uninterpreted constants. Our
\<^assumption>\<open>perfect-wheel\<close> is translated into a calculation of the circumference of the
wheel, while \<open>\<delta>s\<^sub>r\<^sub>e\<^sub>s\<close>, the resolution of the odometer, can be calculated
from these parameters. HOL-Analysis permits formalizing the fundamental physical observables:
\begin{isar}
type_synonym distance_function = "real\<Rightarrow>real"
definition Speed::"distance_function\<Rightarrow>real\<Rightarrow>real" where "Speed f \<equiv> deriv f"
definition Accel::"distance_function\<Rightarrow>real\<Rightarrow>real"
where "Accel f \<equiv> deriv (deriv f)"
\end{isar}
which permits to constrain the central observable \inlineisar|distance_function| in a
@{theory_text [display]\<open>
type_synonym distance_function = "real\<Rightarrow>real"
definition Speed::"distance_function\<Rightarrow>real\<Rightarrow>real" where "Speed f \<equiv> deriv f"
definition Accel::"distance_function\<Rightarrow>real\<Rightarrow>real" where "Accel f \<equiv> deriv (deriv f)"
\<close>}
which permits constraining the central observable \<open>distance_function\<close> in a
way that describes the space of ``normal behavior'' where we expect the odometer to produce
reliable measurements over a \inlineisar|distance_function df|.
reliable measurements over a \<open>distance_function df\<close>.
The essence of the physics of the train is covered by the following definition:
\begin{isar}
definition normally_behaved_distance_function :: "(real \<Rightarrow> real) \<Rightarrow> bool"
where normally_behaved_distance_function df =
( \<forall> t. df(t) \<in> \<real>$_{\ge 0}$ \<and> (\<forall> t \<in> \<real>$_{\le 0}$. df(t) = 0)
\<and> df differentiable on$_{\text{R}}$ \<and> (Speed df)differentiable on$_{\text{R}}$
\<and> (Accel df)differentiable on$_{\ensuremath{R}}$
\<and> (\<forall> t. (Speed df) t \<in> {-Speed$_{\text{Max}}$ .. Speed$_{\text{Max}}$})
\<and> (\<forall> t. (Accel df) t \<in> {-\<bar>Accel$_{\text{Max}}$\<bar> .. \<bar>Accel$_{\text{Max}}$\<bar>}))
\end{isar}
@{theory_text [display]\<open>
definition normally_behaved_distance_function :: "(real \<Rightarrow> real) \<Rightarrow> bool"
where normally_behaved_distance_function df =
( \<forall> t. df(t) \<in> \<real>\<^sub>\<ge>\<^sub>0 \<and> (\<forall> t \<in> \<real>\<^sub>\<le>\<^sub>0. df(t) = 0)
\<and> df differentiable on \<real>\<^sub>\<ge>\<^sub>0 \<and> (Speed df) differentiable on \<real>\<^sub>\<ge>\<^sub>0
\<and> (Accel df) differentiable on \<real>\<^sub>\<ge>\<^sub>0
\<and> (\<forall> t. (Speed df) t \<in> {Speed\<^sub>M\<^sub>i\<^sub>n .. Speed\<^sub>M\<^sub>a\<^sub>x})
\<and> (\<forall> t. (Accel df) t \<in> {Accel\<^sub>M\<^sub>i\<^sub>n .. Accel\<^sub>M\<^sub>a\<^sub>x}))
\<close>}
which constrains the distance functions to the bounds described in the informal descriptions and
states them as three-fold differentiable function in certain bounds concerning speed and acceleration.
Note that violations, in particular of the constraints on speed and acceleration, \<^emph>\<open>do\<close> occur in practice.
In such cases, the global system adapts recovery strategies that are out of the scope of our model.
Concepts like \inlineisar+shaft_encoder_state+ (a triple with the sensor values
\inlineisar{C1}, \inlineisar{C2}, \inlineisar{C3}) were formalized as types, while tables were
states them as three-fold differentiable functions within certain bounds concerning speed and
acceleration. Note that violations, in particular of the constraints on speed and acceleration,
\<^emph>\<open>do\<close> occur in practice. In such cases, the global system adapts recovery strategies that are out
of the scope of our model. Concepts like \<open>shaft_encoder_state\<close> (a triple with the sensor values
\<open>C1\<close>, \<open>C2\<close>, \<open>C3\<close>) were formalized as types, while tables were
defined as recursive functions:
\enlargethispage{2\baselineskip}\begin{isar}
fun phase$_0$ :: "nat \<Rightarrow> shaft_encoder_state" where
"phase$_0$ (0) = \<lparr> C1 = False, C2 = False, C3 = True \<rparr>"
|"phase$_0$ (1) = \<lparr> C1 = True, C2 = False, C3 = True \<rparr>"
|"phase$_0$ (2) = \<lparr> C1 = True, C2 = False, C3 = False\<rparr>"
|"phase$_0$ (3) = \<lparr> C1 = True, C2 = True, C3 = False\<rparr>"
|"phase$_0$ (4) = \<lparr> C1 = False, C2 = True, C3 = False\<rparr>"
|"phase$_0$ (5) = \<lparr> C1 = False, C2 = True, C3 = True \<rparr>"
|"phase$_0$ x = phase$_0$(x - 6)"
definition Phase ::"nat\<Rightarrow>shaft_encoder_state" where Phase(x) = phase$_0$(x-1)
\end{isar}
We now define shaft encoder sequences as
translations of distance functions:
\begin{isar}
definition encoding::"distance_function\<Rightarrow>nat\<Rightarrow>real\<Rightarrow>shaft_encoder_state"
where "encoding df init$_{\text{pos}}$ \<equiv> \<lambda>x. Phase(nat\<lfloor>df(x) / \<delta>s$_{\text{res}}$\<rfloor> + init$_{\text{pos}}$)"
\end{isar}
where \inlineisar+init$_{\text{pos}}$+ is the initial position of the wheel.
\inlineisar+sampling+'s were constructed from encoding sequences over discretized time points:
\begin{isar}
definition $\!\!$sampling::"distance$\!$_function\<Rightarrow>nat\<Rightarrow>real\<Rightarrow>nat\<Rightarrow>shaft$\!$_encoder$\!$_state"
where "sampling df init$_{\text{pos}}$ \<delta>t \<equiv> \<lambda>n::nat. encoding df init$_{\text{pos}}$ (n * \<delta>t)"
\end{isar}
The sampling interval \inlineisar+\<delta>t+ (the inverse of the sampling frequency) is a critical
@{theory_text [display]\<open>
fun phase\<^sub>0 :: "nat \<Rightarrow> shaft_encoder_state" where
"phase\<^sub>0 (0) = \<lparr> C1 = False, C2 = False, C3 = True \<rparr>"
|"phase\<^sub>0 (1) = \<lparr> C1 = True, C2 = False, C3 = True \<rparr>"
|"phase\<^sub>0 (2) = \<lparr> C1 = True, C2 = False, C3 = False\<rparr>"
|"phase\<^sub>0 (3) = \<lparr> C1 = True, C2 = True, C3 = False\<rparr>"
|"phase\<^sub>0 (4) = \<lparr> C1 = False, C2 = True, C3 = False\<rparr>"
|"phase\<^sub>0 (5) = \<lparr> C1 = False, C2 = True, C3 = True \<rparr>"
|"phase\<^sub>0 x = phase\<^sub>0(x - 6)"
definition Phase ::"nat\<Rightarrow>shaft_encoder_state" where Phase(x) = phase\<^sub>0(x-1)
\<close>}
We now define shaft encoder sequences as translations of distance functions:
@{theory_text [display]\<open>
definition encoding::"distance_function\<Rightarrow>nat\<Rightarrow>real\<Rightarrow>shaft_encoder_state"
where "encoding df init\<^sub>p\<^sub>o\<^sub>s \<equiv> \<lambda>x. Phase(nat\<lfloor>df(x) / \<delta>s\<^sub>r\<^sub>e\<^sub>s\<rfloor> + init\<^sub>p\<^sub>o\<^sub>s)"
\<close>}
where \<open>init\<^sub>p\<^sub>o\<^sub>s\<close> is the initial position of the wheel.
\<open>sampling\<close>'s were constructed from encoding sequences over discretized time points:
@{theory_text [display]\<open>
definition sampling::"distance_function\<Rightarrow>nat\<Rightarrow>real\<Rightarrow>nat\<Rightarrow>shaft_encoder_state"
where "sampling df init\<^sub>p\<^sub>o\<^sub>s \<delta>t \<equiv> \<lambda>n::nat. encoding df initinit\<^sub>p\<^sub>o\<^sub>s (n * \<delta>t)"
\<close>}
The sampling interval \<open>\<delta>t\<close> (the inverse of the sampling frequency) is a critical
parameter of the configuration of a system.
Finally, we can formally define the required performances. From the interface description
Finally, we can formally define the required performances. From the interface description
and the global model parameters such as wheel diameter, the number of teeth per wheel, the sampling
frequency etc., we can infer the maximal time of service as well as the maximum distance the
device can measure.
As an example configuration, choosing 1m for
\inlineisar+w$_d$+, 100 for \inlineisar+tpw+, 80km/h \inlineisar+Speed$_{\text{Max}}$+,
\<open>w\<^sub>d\<close>, 100 for \<open>tpw\<close>, 80 km/h for \<open>Speed\<^sub>M\<^sub>a\<^sub>x\<close>,
and 14400Hz for the sampling frequency, results in an odometer resolution of 2.3mm,
a maximum distance of 9878km, and a maximal system up-time of 123.4 hours.
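As a quick plausibility check (a sketch only: it merely assumes that the position counter is an
unsigned 32-bit value and re-uses the 2.3mm resolution stated above; the value names are ours,
not part of the development), the maximum measurable distance can be recomputed in Isabelle/ML:
@{theory_text [display]\<open>
ML\<open>
  val counter_max  = 1024 * 1024 * 1024 * 4;   (* 2^32 position samples *)
  val delta_s_res  = 0.0023;                   (* odometer resolution in m, as stated above *)
  (* maximum measurable distance in m: approx. 9.878e6 m, i.e. roughly 9878 km *)
  val max_distance = Real.fromInt counter_max * delta_s_res;
\<close>
\<close>}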
The required precision of an odometer can be defined by a constant describing
the maximally allowed difference between \inlineisar+df(n*\<delta>t)+ and
\inlineisar+sampling df init$_{\text{pos}}$ \<delta>t n+ for all \inlineisar+init$_{\text{pos}}$ \<in>{0..5}+.
the maximally allowed difference between \<open>df(n*\<delta>t)\<close> and
\<open>sampling df init\<^sub>p\<^sub>o\<^sub>s \<delta>t n\<close> for all \<open>init\<^sub>p\<^sub>o\<^sub>s \<in>{0..5}\<close>.
\<close>
(*<*)
ML\<open>val two_thirty2 = 1024 * 1024 * 1024 * 4;
@ -253,41 +287,50 @@ ML\<open>val two_thirty2 = 1024 * 1024 * 1024 * 4;
section*[verific::technical]\<open>Verification of the Software Requirements Specification\<close>
text\<open>The original documents already contained various statements that motivate certain safety
properties of the device. For example, the \inlineisar+Phase+-table excludes situations in which
all sensors \inlineisar{C1}, \inlineisar{C2}, and \inlineisar{C3} are all ``off'' or situations in
properties of the device. For example, the \<open>Phase\<close>-table excludes situations in which
all sensors \<open>C1\<close>, \<open>C2\<close>, and \<open>C3\<close> are ``off'' or situations in
which all sensors are ``on,'' either reflecting a physical or electrical error in the odometer. It can be
shown by a very small Isabelle case-distinction proof that this safety requirement follows indeed from the
above definitions:
\begin{isar}
lemma Encoder_Property_1:(C1(Phase x) \<and> C2(Phase x) \<and> C3(Phase x))=False
proof (cases x)
case 0 then show ?thesis by (simp add: Phase_def)
next
case (Suc n) then show ?thesis
by(simp add: Phase_def,rule_tac n = n in cycle_case_split,simp_all)
qed
\end{isar}
for all positions \inlineisar+x+. Similarly, it is proved that the table is indeed
cyclic: \inlineisar+ phase$_0$ x = phase$_0$(x mod 6)+ and locally injective:
\inlineisar+\<forall>x<6. \<forall>y<6. phase$_0$ x = phase$_0$ y \<longrightarrow> x = y+.
These lemmas, building the ``theory of an odometer,'' culminate in a theorem
that we would like to present in more detail.
\begin{isar}
theorem minimal_sampling :
assumes * : normally_behaved_distance_function df
and ** : \<delta>t * Speed$_{\text{Max}}$ < \<delta>s$_{\text{res}}$
shows \<forall> \<delta>X\<le>\<delta>t. 0<\<delta>X \<longrightarrow>
\<exists>f. retracting (f::nat\<Rightarrow>nat) \<and>
sampling df init$_{\text{pos}}$ \<delta>X = (sampling df init$_{\text{pos}}$ \<delta>t) o f
shown by a very small Isabelle case-distinction proof that this safety requirement indeed follows
from the above definitions:
\end{isar}
This theorem states for \inlineisar+normally_behaved_distance_function+s that there is
@{theory_text [display]\<open>
lemma Encoder_Property_1:(C1(Phase x) \<and> C2(Phase x) \<and> C3(Phase x))=False
proof (cases x)
case 0 then show ?thesis by (simp add: Phase_def)
next
case (Suc n) then show ?thesis
by(simp add: Phase_def,rule_tac n = n in cycle_case_split,simp_all)
qed
\<close>}
for all positions \<open>x\<close>. Similarly, it is proved that the table is indeed cyclic:
\<open>phase\<^sub>0 x = phase\<^sub>0(x mod 6)\<close>
and locally injective:
\<open>\<forall>x<6. \<forall>y<6. phase\<^sub>0 x = phase\<^sub>0 y \<longrightarrow> x = y\<close>.
These lemmas, building the ``theory of an odometer,'' culminate in a theorem
that we would like to present in more detail.
@{theory_text [display]\<open>
theorem minimal_sampling :
assumes * : normally_behaved_distance_function df
and ** : \<delta>t * Speed\<^sub>M\<^sub>a\<^sub>x < \<delta>s\<^sub>r\<^sub>e\<^sub>s
shows \<forall> \<delta>X\<le>\<delta>t. 0<\<delta>X \<longrightarrow>
\<exists>f. retracting (f::nat\<Rightarrow>nat) \<and>
sampling df init\<^sub>p\<^sub>o\<^sub>s \<delta>X = (sampling df init\<^sub>p\<^sub>o\<^sub>s \<delta>t) o f
\<close>}
This theorem states for \<open>normally_behaved_distance_function\<close>s that there is
a minimal sampling frequency assuring the safety of the measurements; samplings on
some \inlineisar$df$ gained from this minimal sampling frequency can be ``pumped up''
some \<open>df\<close> gained from this minimal sampling frequency can be ``pumped up''
to samplings at higher sampling frequencies; they do not contain more information.
Of particular interest is the second assumption, labelled ``\inlineisar|**|,'' which
establishes a lower bound from \inlineisar+w$_{\text{circ}}$+, \inlineisar+tpw+,
\inlineisar+Speed$_{\text{Max}}$+ for the sampling frequency. Methodologically, this represents
Of particular interest is the second assumption, labelled ``\<open>**\<close>,'' which
establishes a lower bound for the sampling frequency derived from \<open>w\<^sub>0\<close>, \<open>tpw\<close>, and
\<open>Speed\<^sub>M\<^sub>a\<^sub>x\<close>. Methodologically, this represents
an exported constraint that cannot be represented \<^emph>\<open>inside\<close> the design model: it means that the
computations have to be fast enough on the computing platform in order to assure that the
calculations are valid. It was in particular this exported constraint that forced us to give up
@ -301,6 +344,7 @@ standard~@{cite "bsi:50128:2014"}, 7.2.4.22 and are usually addressed in an own
\<close>
chapter*[ontomodeling::text_section]\<open>The CENELEC 50128 Ontology\<close>
text\<open>
Modeling an ontology from a semi-formal text such as~@{cite"bsi:50128:2014"} is,
like any other modeling activity, not a simple one-to-one translation of some
@ -312,125 +356,144 @@ text\<open>
section*[lhf::text_section]
\<open>Tracking Concepts and Definitions\<close>
text\<open>
\isadof is designed to annotate text elements with structured meta-information and to reference
\<^isadof> is designed to annotate text elements with structured meta-information and to reference
these text elements throughout the integrated source. A classical application of this capability
is the annotation of concept and term definitions---be they informal, semi-formal or formal---and
their consistent referencing. In the context of our CENELEC ontology, \eg, we can translate the
their consistent referencing. In the context of our CENELEC ontology, \<^eg>, we can translate the
third chapter of @{cite "bsi:50128:2014"} ``Terms, Definitions and Abbreviations'' directly
into our Ontology Definition Language (ODL). Picking one example out of 49, consider the definition
of the concept ``traceability'' in paragraphs 3.1.46 (a notion referenced 31 times in the standard),
which we translated directly into:
\begin{isar}
Definition*[traceability::concept]<open> degree to which relationship
can be established between two or more products of a development
process, especially those having a predecessor/successor or
master/subordinate relationship to one another. <close>
\end{isar}
In the integrated source of the odometry study, we can reference in a text element to this
of the concept \<^cenelec_term>\<open>traceability\<close> in paragraph 3.1.46 (a notion referenced 31 times in
the standard), which we translated directly into:
@{theory_text [display]\<open>
Definition*[traceability, short_name="''traceability''"]
\<open>degree to which relationship can be established between two or more products of a
development process, especially those having a predecessor/successor or
master/subordinate relationship to one another.\<close>
\<close>}
In the integrated source of the odometry study, a text element can then reference this
concept as follows:
\begin{isar}
text*[...]<open> ... to assure <@>{concept traceability} for
<@>{requirement bitwiseAND}, we prove ... <close>
\end{isar}
The presentation of this document element inside \isadof is immediately hyperlinked against the
\inlineisar+Definition*+ element shown above; this serves as documentation of
@{theory_text [display]\<open>
text*[...]\<open> ... to assure <@>{cenelec_term traceability} for
<@>{requirement bitwiseAND}, we prove ... \<close>
\<close>}
\<^isadof> also uses the underlying ontology to generate the navigation markup inside the IDE, \<^ie>
the presentation of this document element inside \<^isadof> is immediately hyperlinked against the
@{theory_text \<open> Definition* \<close>}-element shown above; this serves as documentation of
the standard for the development team working on the integrated source. The PDF presentation
of such links depends on the actual configurations for the document generation; we will explain
this later.
CENELEC also foresees a number of roles, phases, safety integrity levels, etc., which were
directly translated into HOL enumeration types usable in ontological concepts of ODL.
\begin{isar}
datatype role =
PM (* Program Manager *) | RQM (* Requirements Manager *)
| DES (* Designer *) | IMP (* Implementer *) |
| VER (* Verifier *) | VAL (* Validator *) | ...
datatype phase =
SYSDEV_ext (* System Development *) | SPl (* Software Planning *)
| SR (* Software Requirement *) | SA (* Software Architecture *)
| SDES (* Software Design *) | ...
\end{isar}
Similarly, we can formalize the Table A.5: Verification and Testing of @{cite "bsi:50128:2014"}:
a classification of \<^emph>\<open>verification and testing techniques\<close>:
\begin{isar}
datatype vnt_technique =
@{theory_text [display]\<open>
datatype role =
PM (* Program Manager *) | RQM (* Requirements Manager *)
| DES (* Designer *) | IMP (* Implementer *)
| VER (* Verifier *) | VAL (* Validator *) | ...
datatype phase =
SYSDEV_ext (* System Development *) | SPl (* Software Planning *)
| SR (* Software Requirement *) | SA (* Software Architecture *)
| SDES (* Software Design *) | ...
\<close>}
Similarly, we can formalize Table A.5 (``Verification and Testing'') of @{cite "bsi:50128:2014"},
a classification of \<^emph>\<open>verification and testing techniques\<close>:
@{theory_text [display]\<open>
datatype vnt_technique =
formal_proof "thm list" | stat_analysis
| dyn_analysis dyn_ana_kind | ...
\end{isar}
In contrast to the standard, we can parameterize \inlineisar+formal_proof+ with a list of
theorems, an entity known in the Isabelle kernel. Here, \isadof assures for text elements
\<close>}
In contrast to the standard, we can parameterize \<open>formal_proof\<close> with a list of
theorems, an entity known to the Isabelle kernel. Here, \<^isadof> ensures, for text elements
annotated with theorem names, that they indeed refer to established theorems in the Isabelle
environment. Additional checks could be added to make sure that these theorems have a particular
form, as sketched below.
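As a minimal sketch of such a check (illustrative only: the function name is ours, and the
concrete shape one would demand depends on the ontology), one could test in Isabelle/ML whether a
registered theorem is stated as an equation:
@{theory_text [display]\<open>
ML\<open>
  (* true iff the conclusion of thm is an HOL equation "lhs = rhs" *)
  fun is_equation thm =
    (case Thm.concl_of thm of
       Const (\<^const_name>\<open>HOL.Trueprop\<close>, _) $ (Const (\<^const_name>\<open>HOL.eq\<close>, _) $ _ $ _) => true
     | _ => false);

  val _ = @{assert} (is_equation @{thm refl})   (* refl is "?t = ?t" *)
\<close>
\<close>}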
While we claim that this possibility to link to theorems (and test-results) is unique in the
world of systems attempting to assure traceability, referencing a particular (proven) theorem is
definitively not sufficient to satisfy the claimed requirement. Human evaluators will always have
to check that the provided theorem \<open>adequately\<close> represents the claim; we do not in the slightest
suggest that their work is superfluous. Our framework allows to statically check that tests or proofs
have been provided, at places where the ontology requires them to be, and both assessors and developers
can rely on this check and navigate through related information easily. It does not guarantee that
intended concepts for, \eg, safety or security have been adequately modeled.
world of systems attempting to assure \<^cenelec_term>\<open>traceability\<close>, referencing a particular
(proven) theorem is definitively not sufficient to satisfy the claimed requirement. Human
evaluators will always have to check that the provided theorem \<open>adequately\<close> represents the claim;
we do not in the slightest suggest that their work is superfluous. Our framework allows
statically checking that tests or proofs have been provided, at places where the ontology requires
them to be, and both assessors and developers can rely on this check and navigate through
related information easily. It does not guarantee that intended concepts for, \<^eg>, safety
or security have been adequately modeled.
\<close>
section*[moe::text_section]
\<open>Major Ontological Entities: Requirements and Evidence\<close>
text\<open>
We introduce central concept of a \<^emph>\<open>requirement\<close> as an ODL \inlineisar*doc_class*
based on some generic basic library \inlineisar*text_element* providing basic layout attributes.
\begin{isar}
doc_class requirement = text_element +
long_name :: "string option"
is_concerned :: "role set"
\end{isar}
where the \inlineisar*roles* are exactly the ones defined in the previous section and represent
the groups of stakeholders in the CENELEC process. Therefore, the \inlineisar+is_concerned+-attribute
allows expressing who ``owns'' this text-element. \isadof supports a role-based
presentation, \eg, different presentation styles of the
integrated source may decide to highlight, to omit, to defer into an annex, text entities
according to the role-set.
We introduce the central concept of a \<^emph>\<open>requirement\<close> as an ODL \<^theory_text>\<open>doc_class\<close>
based on the generic basic library \<^doc_class>\<open>text_element\<close> providing basic layout attributes.
@{theory_text [display]\<open>
doc_class requirement = text_element +
long_name :: "string option"
is_concerned :: "role set"
\<close>}
where the \<open>role\<close>s are exactly the ones defined in the previous section and represent
the groups of stakeholders in the CENELEC process. Therefore, the \<open>is_concerned\<close>-attribute
allows expressing who ``owns'' this text-element. \<^isadof> supports a role-based
presentation, \<^eg>, different presentation styles of the integrated source may decide to highlight,
to omit, to defer into an annex, text entities according to the role-set.
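As a sketch of how a requirement might then be stated and ``owned'' (the instance name, the wording,
and the role set below are purely illustrative and not part of the odometry development), one could write:
@{theory_text [display]\<open>
text*[speed_format::requirement, long_name="Some ''speed output format''",
      is_concerned="{RQM, DES, VER}"]
\<open>The speed output shall be encoded as a 32-bit integer representing mm/sec.\<close>
\<close>}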
Since ODL supports single inheritance, we can express sub-requirements and therefore a style
of requirement decomposition as advocated in GSN~@{cite "kelly.ea:goal:2004"}:
\begin{isar}
doc_class sub_requirement =
decomposes :: "requirement"
relates_to :: "requirement set"
\end{isar}\<close>
@{theory_text [display]\<open>
doc_class sub_requirement =
decomposes :: "requirement"
relates_to :: "requirement set"
\<close>}
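For instance (again only a sketch with hypothetical instance names), a derived requirement could be
attached to its parent requirement as follows:
@{theory_text [display]\<open>
text*[speed_format_overflow::sub_requirement,
      decomposes="@{docitem \<open>speed_format\<close>}"]
\<open>On overflow of the underlying counters, the speed output shall saturate rather than wrap around.\<close>
\<close>}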
\<close>
section*[claimsreqevidence::text_section]\<open>Tracking Claims, Derived Requirements and Evidence\<close>
text\<open>As an example of making implicit principles explicit,
consider the following statement @{cite "bsi:50128:2014"}, pp. 25.:\vspace{-1.5mm}
consider the following statement of @{cite "bsi:50128:2014"}, p. 25: \<^vs>\<open>-0.15cm\<close>
\begin{quote}\small
The objective of software verification is to examine and arrive at a judgment based on
evidence that output items (process, documentation, software or application) of a specific
development phase fulfill the requirements and plans with respect to completeness, correctness
and consistency.
\end{quote}\vspace{-1.5mm}
The terms \<^emph>\<open>judgment\<close> and \<^emph>\<open>evidence\<close> are used as a kind of leitmotif throughout the CENELEC
standard, but they are neither explained nor even listed in the general glossary. However, the
standard is fairly explicit on the \<^emph>\<open>phase\<close>s and the organizational roles that different stakeholders
should have in the process. Our version to express this key concept of judgment, \eg, by
the following concept:
\begin{isar}
doc_class judgement =
refers_to :: requirement
evidence :: "vnt_technique list"
status :: status
is_concerned :: "role set" <= "{VER,ASR,VAL}"
\end{isar}
\end{quote} \<^vs>\<open>-0.15cm\<close>
The terms \<^onto_class>\<open>judgement\<close> and \<^term>\<open>evidence\<close> are used as a kind of leitmotif throughout
the CENELEC standard, but they are neither explained nor even listed in the general glossary.
However, the standard is fairly explicit on the \<^emph>\<open>phase\<close>s and the organizational roles that
different stakeholders should have in the process. We express this key concept of
\<^onto_class>\<open>judgement\<close>, \<^eg>, by the following concept:
@{theory_text [display]\<open>
doc_class judgement =
refers_to :: requirement
evidence :: "vnt_technique list"
status :: status
is_concerned :: "role set" <= "{VER,ASR,VAL}"
\<close>}
As one can see, the role set is by default set to the verification team, the assessors, and the
validation team.
There are different views possible here: an alternative would be to define \inlineisar+evidence+
as ontological concept with \inlineisar+vnt_technique+'s (rather than an attribute of judgement)
and consider the basis of judgments as a relation between requirements and relation:
\begin{isar}
doc_class judgement =
based_on :: "(requirement \<times> evidence) set"
status :: status
is_concerned :: "role set" <= "{VER,ASR,VAL}"
\end{isar}
There are different views possible here: an alternative would be to define \<^term>\<open>evidence\<close>
as an ontological concept with \<^typ>\<open>vnt_technique\<close>'s (rather than as an attribute of the judgement)
and base the judgement on a summary containing the relation between requirements and evidence:
@{theory_text [display]\<open>
doc_class summary =
based_on :: "(requirement \<times> evidence) set"
status :: status
is_concerned :: "role set" <= "{VER,ASR,VAL}"
\<close>}
More experimentation will be needed to find out what kind of ontological modeling is most
adequate for developers in the context of \isadof.
@ -442,60 +505,66 @@ text\<open>From the variety of different possibilities for adding CENELEC annota
integrated source, we will, in the following, point out three scenarios.\<close>
subsection\<open>Internal Verification of Claims in the Requirements Specification.\<close>
text\<open>In our case, the SR-team early on detected a property necessary
text\<open>In our case, the \<^term>\<open>SR\<close>-team early on detected a property necessary
for error-detection of the device (cf. @{technical verific}):
\enlargethispage{2\baselineskip}\begin{isar}
text*[encoder_props::requirement]<open> The requirement specification team ...
C1 & C2 & C3 = 0 (bitwise logical AND operation)
C1 | C2 | C3 = 1 (bitwise logical OR operation) <close>
\end{isar}
@{theory_text [display]\<open>
text*[encoder_props::requirement]\<open> The requirement specification team identifies the property:
C1 & C2 & C3 = 0 (bitwise logical AND operation)
C1 | C2 | C3 = 1 (bitwise logical OR operation) \<close>
\<close>}
After the Isabelle proofs shown in @{technical verific}, we can either register the theorems
directly in an evidence statement:
\begin{isar}
text*[J1::judgement, refers_to="<@>{docitem <open>encoder_props<close>}",
evidence="[formal_proof[<@>{thm <open>Encoder_Property_1<close>},
<@>{thm <open>Encoder_Property_2<close>}]]"]
<open>The required encoder properties are in fact verified to be consistent
with the formalization of <@>{term "phase$_0$"}.<close>
\end{isar}
The references \inlineisar|<@>{...}|, called antiquotation, allow us not only to reference to
@{theory_text [display]\<open>
text*[J1::judgement, refers_to="@{docitem <open>encoder_props<close>}",
evidence="[formal_proof[@{thm <open>Encoder_Property_1<close>},
@{thm <open>Encoder_Property_2<close>}]]"]
\<open>The required encoder properties are in fact verified to be consistent
with the formalization of @{term "phase\<^sub>0"}.\<close>
\<close>}
The references \<open>@{...}\<close>, called antiquotations, allow us not only to reference
formal concepts; they are also checked for consistency, and there are antiquotations that
print the formally checked content (\eg, the statement of a theorem).
print the formally checked content (\<^eg>, the statement of a theorem).
\<close>
subsection\<open>Exporting Claims of the Requirements Specification.\<close>
text\<open>By definition, the main purpose of the requirement specification is the
identification of the safety requirements. As an example, we state the required precision of an
odometric function: for any normally behaved distance function \inlineisar+df+, and any representable
and valid sampling sequence that can be constructed for \inlineisar+df+, we require that the
difference between the physical distance and distance calculable from the
@{term Odometric_Position_Count} is bound by the minimal resolution of the odometer.
\begin{isar}
text*[R5::safety_requirement]<open>We can now state ... <close>
definition
Odometric_Position_Count_precise::(shaft_encoder_state list\<Rightarrow>output)\<Rightarrow>bool
where Odometric_Position_Count_precise odofunction \<equiv>
(\<forall> df. \<forall>S. normally_behaved_distance_function df
\<longrightarrow> representable S
\<longrightarrow> valid_sampling S df
\<longrightarrow> (let pos = uint(Odometric_Position_Count(odofunction S))
in \<bar>df((length S - 1)*\<delta>t$_{\text{odo}}$) - (\<delta>s$_{\text{res}}$ * pos)\<bar> \<le> \<delta>s$_{\text{res}}$))
update_instance*[R5::safety_requirement,
formal_definition:="[<@>{thm <open>Odometric_Position_Count_precise_def<close>}]"]
\end{isar}
By \inlineisar+update_instance*+, we book the property \inlineisar+Position_Count_precise_def+ as
\inlineisar+safety_requirement+, a specific sub-class of \inlineisar+requirement+s
text\<open>By definition, the main purpose of the requirement specification is the identification of
the safety requirements. As an example, we state the required precision of an odometric function:
for any normally behaved distance function \<open>df\<close>, and any representable and valid
sampling sequence that can be constructed for \<open>df\<close>, we require that the difference
between the physical distance and the distance calculable from the @{term Odometric_Position_Count}
is bounded by the minimal resolution of the odometer.
@{theory_text [display]\<open>
text*[R5::safety_requirement]\<open>We can now state ... \<close>
definition Odometric_Position_Count_precise :: "(shaft_encoder_state list\<Rightarrow>output)\<Rightarrow>bool"
where "Odometric_Position_Count_precise odofunction \<equiv>
(\<forall> df. \<forall>S. normally_behaved_distance_function df
\<longrightarrow> representable S
\<longrightarrow> valid_sampling S df
\<longrightarrow> (let pos = uint(Odometric_Position_Count(odofunction S))
in \<bar>df((length S - 1)*\<delta>t\<^sub>o\<^sub>d\<^sub>o) - (\<delta>s\<^sub>r\<^sub>e\<^sub>s * pos)\<bar> \<le> \<delta>s\<^sub>r\<^sub>e\<^sub>s))"
update_instance*[R5::safety_requirement,
formal_definition:="[@{thm \<open>Odometric_Position_Count_precise_def\<close>}]"]
\<close>}
By \<^theory_text>\<open>update_instance*\<close>, we register the property \<open>Odometric_Position_Count_precise_def\<close> as a
\<^onto_class>\<open>safety_requirement\<close>, a specific sub-class of \<^onto_class>\<open>requirement\<close>s
that requires a formal definition in Isabelle.\<close>
subsection\<open>Exporting Derived Requirements.\<close>
text\<open>Finally, we discuss the situation where the verification team discovered a critical side-condition
for a major theorem necessary for the safety requirements; in our development, this was the case for
the condition labelled ``\inlineisar|**|'' in @{docitem verific}. The current CENELEC standard clearly separates
the condition labelled ``\<open>**\<close>'' in @{docitem verific}. The current CENELEC standard clearly separates
``requirement specifications'' from ``verification reports,'' which is probably motivated
by the overall concern of organizational separation and of document consistency. While this
document organization is possible in \isadof, it is in our experience often counter-productive
document organization is possible in \<^isadof>, it is in our experience often counter-productive
in practice: organizations tend to defend their documents because the impact of changes becomes more and more
difficult to assess. This effect results in a dramatic development slow-down and an increase of
costs. Furthermore, these barriers exclude situations where developers perfectly know, for example,
@ -510,58 +579,58 @@ different PDF versions and for each version, document specific consistency guara
automatically enforced.
In our case study, we define this condition as a predicate, declare an explanation of it as an
\inlineisar+SRAC+ (CENELEC for: safety-related application condition; ontologically, this is a
derived class from \inlineisar+requirement+.) and add the definition of the predicate into the
\<^onto_class>\<open>SRAC\<close> (CENELEC terminology for a safety-related application condition; ontologically, a class
derived from \<^onto_class>\<open>requirement\<close>), and add the definition of the predicate to the
document instance as described in the previous section.\<close>
text\<open>\appendix\<close>
chapter\<open>Appendix\<close>
text\<open>
\<^item> \inlineisar|<@>{thm refl}|: @{thm refl}
\<^item> \inlineisar|<@>{thm [source] refl}|: @{thm [source] refl}
\<^item> \inlineisar|<@>{thm[mode=Rule] conjI}|: @{thm[mode=Rule] conjI}
\<^item> \inlineisar|<@>{file "mini_odo.thy"}|: @{file "mini_odo.thy"}
\<^item> \inlineisar|<@>{value "3+4::int"}|: @{value "3+4::int"}
\<^item> \inlineisar|<@>{const hd}|: @{const hd}
\<^item> \inlineisar|<@>{theory HOL.List}|: @{theory HOL.List}
\<^item> \inlineisar|<@>{term "3"}|: @{term "3"}
\<^item> \inlineisar|<@>{type bool}|: @{type bool}
\<^item> \inlineisar|<@>{term [show_types] "f x = a + x"}|: @{term [show_types] "f x = a + x"}
\<^item> \<open>@{thm refl}\<close> : @{thm refl}
\<^item> \<open>@{thm [source] refl}\<close> : @{thm [source] refl}
\<^item> \<open>@{thm[mode=Rule] conjI}\<close> : @{thm[mode=Rule] conjI}
\<^item> \<open>@{file "mini_odo.thy"}\<close> : @{file "mini_odo.thy"}
\<^item> \<open>@{value "3+4::int"}}\<close> : @{value "3+4::int"}
\<^item> \<open>@{const hd}\<close> : @{const hd}
\<^item> \<open>@{theory HOL.List}\<close> : @{theory HOL.List}
\<^item> \<open>@{term "3"}\<close> : @{term "3"}
\<^item> \<open>@{type bool}\<close> : @{type bool}
\<^item> \<open>@{term [show_types] "f x = a + x"}\<close> : @{term [show_types] "f x = a + x"}
\<close>
text\<open>Examples for declaration of typed doc-items "assumption" and "hypothesis",
text\<open>Examples for declaration of typed doc-classes "assumption" (sic!) and "hypothesis" (sic!!),
concepts defined in the underlying ontology @{theory "Isabelle_DOF.CENELEC_50128"}. \<close>
text*[ass2::assumption, long_name="Some ''assumption one''"] \<open> The subsystem Y is safe. \<close>
text*[hyp1::hypothesis] \<open> P not equal NP \<close>
text\<open>A real example fragment from a larger project, declaring a text-element as a
"safety-related application condition", a concept defined in the
@{theory "Isabelle_DOF.CENELEC_50128"} ontology:\<close>
text\<open>
A real example fragment from a larger project, declaring a text-element as a
"safety-related application condition", a concept defined in the
@{theory "Isabelle_DOF.CENELEC_50128"} ontology:\<close>
text*[hyp2::hypothesis]\<open>Under the assumption @{assumption \<open>ass2\<close>} we establish the following: ... \<close>
text*[ass122::SRAC, long_name="Some ''ass122''"] \<open> The overall sampling frequence of the odometer
subsystem is therefore 14 khz, which includes sampling, computing and
result communication times... \<close>
text*[ass122::SRAC, long_name="Some ''ass122''"]
\<open> The overall sampling frequency of the odometer subsystem is therefore 14 kHz,
which includes sampling, computing and result communication times... \<close>
text*[ass123::SRAC] \<open> The overall sampling frequence of the odometer
subsystem is therefore 14 khz, which includes sampling, computing and
result communication times... \<close>
text*[ass123::SRAC]
\<open> The overall sampling frequency of the odometer subsystem is therefore 14 kHz,
which includes sampling, computing and result communication times... \<close>
text*[ass124::EC, long_name="Some ''ass124''"] \<open> The overall sampling frequence of the odometer
subsystem is therefore 14 khz, which includes sampling, computing and
result communication times... \<close>
text*[ass124::EC, long_name="Some ''ass124''"]
\<open> The overall sampling frequency of the odometer subsystem is therefore 14 kHz,
which includes sampling, computing and result communication times... \<close>
text*[t10::test_result] \<open> This is a meta-test. This could be an ML-command
that governs the external test-execution via, eg., a makefile or specific calls
to a test-environment or test-engine. \<close>
text*[t10::test_result]
\<open> This is a meta-test. This could be an ML-command that governs the external
test-execution via, \<^eg>, a makefile or specific calls to a test-environment or test-engine. \<close>
text\<open>Finally some examples of references to doc-items, i.e. text-elements with declared
meta-information and status. \<close>
text
\<open> Finally some examples of references to doc-items, i.e. text-elements
with declared meta-information and status. \<close>
text \<open> As established by @{test_result (unchecked) \<open>t10\<close>},
@{test_result (define) \<open>t10\<close>} \<close>
text \<open> the @{test_result \<open>t10\<close>}
@ -569,10 +638,10 @@ text \<open> the @{test_result \<open>t10\<close>}
text \<open> represent a justification of the safety related applicability
condition @{SRAC \<open>ass122\<close>} aka exported constraint @{EC \<open>ass122\<close>}.\<close>
(* due to notational conventions for antiquotations, one may even write: *)
(* CRASHES WITH LaTeX backend error. *)
text \<open> represent a justification of the safety related applicability
condition \<^SRAC>\<open>ass122\<close> aka exported constraint \<^EC>\<open>ass122\<close>.\<close>
text \<open> due to notational conventions for antiquotations, one may even write:
"represent a justification of the safety related applicability
condition \<^SRAC>\<open>ass122\<close> aka exported constraint \<^EC>\<open>ass122\<close>."\<close>
(*<*)

View File

@ -77,7 +77,7 @@ check_isabelle_version() {
check_afp_entries() {
echo "* Checking availability of AFP entries:"
missing=""
required="Regular-Sets Functional-Automata"
required="Regular-Sets Functional-Automata Physical_Quantities"
for afp in $required; do
res=`$ISABELLE_TOOL build -n $afp 2>/dev/null || true`
if [ "$res" != "" ]; then

View File

@ -2446,6 +2446,8 @@ val parse_cid = Scan.lift(Parse.position Args.name)
val parse_oid' = Term_Style.parse -- parse_oid
val parse_cid' = Term_Style.parse -- parse_cid
val parse_attribute_access = (parse_oid
--| (Scan.lift @{keyword "::"})
-- Scan.lift (Parse.position Args.name))
@ -2477,7 +2479,7 @@ fun trace_attr_2_ML ctxt (oid:string,pos) =
fun compute_cid_repr ctxt cid pos =
if DOF_core.is_defined_cid_local cid ctxt then Const(cid,dummyT)
else ISA_core.err "Undefined Class Id" pos
else ISA_core.err ("Undefined Class Identifier:"^cid) pos
local
@ -2490,6 +2492,25 @@ fun pretty_trace_style ctxt (style, (oid,pos)) =
fun pretty_cid_style ctxt (style, (cid,pos)) =
Document_Output.pretty_term ctxt (style (compute_cid_repr ctxt cid pos));
(* NEW VERSION: PLEASE INTEGRATE ALL OVER : *)
fun context_position_parser parse_con (ctxt, toks) =
let val pos = case toks of
a :: _ => Token.pos_of a
| _ => @{here} \<comment> \<open>a real hack !\<close>
val (res, (ctxt', toks')) = parse_con (ctxt, toks)
in ((res,pos),(ctxt', toks')) end
val parse_cid = (context_position_parser Args.typ_abbrev)
>> (fn (Type(ss,_),pos) => (pos,ss)
|( _,pos) => ISA_core.err "Undefined Class Id" pos);
val parse_cid' = Term_Style.parse -- parse_cid
fun pretty_cid_style ctxt (style,(pos,cid)) =
(* reconversion to term in order to have access to term print options like short_names etc. *)
Document_Output.pretty_term ctxt ((compute_cid_repr ctxt cid pos));
in
val _ = Theory.setup
(ML_Antiquotation.inline \<^binding>\<open>docitem\<close>
@ -2500,6 +2521,7 @@ val _ = Theory.setup
(fn (ctxt,toks) => (parse_oid >> trace_attr_2_ML ctxt) (ctxt, toks)) #>
basic_entity \<^binding>\<open>trace_attribute\<close> parse_oid' pretty_trace_style #>
basic_entity \<^binding>\<open>doc_class\<close> parse_cid' pretty_cid_style #>
basic_entity \<^binding>\<open>onto_class\<close> parse_cid' pretty_cid_style #>
basic_entity \<^binding>\<open>docitem_attribute\<close> parse_attribute_access' pretty_attr_access_style
)
end

View File

@ -15,9 +15,9 @@ chapter \<open>An Outline of a CENELEC Ontology\<close>
text\<open> NOTE: An Ontology-Model of a certification standard such as CENELEC or Common Criteria
identifies:
- notions (conceptual \emph{categories}) having \emph{instances}
- notions (conceptual \<^emph>\<open>categories\<close>) having \<^emph>\<open>instances\<close>
(similar to classes and objects),
- their subtype relation (eg., a "SRAC" is an "assumption"),
- their \<^emph>\<open>subtype\<close> relation (e.g., a "SRAC" is an "assumption"),
- their syntactical structure
(for the moment: defined by regular expressions describing the
order of category instances in the overall document as a regular language)
@ -40,23 +40,78 @@ begin
*)
text\<open>We re-use the class @\<open>typ math_content\<close>, which provides also a framework for
text\<open>We re-use the class \<^typ>\<open>math_content\<close>, which also provides a framework for
semi-formal "math-alike" terminology, which we re-use in this definition.\<close>
doc_class semi_formal_content = math_content +
status :: status <= "semiformal"
mcc :: math_content_class <= "terminology"
mcc :: math_content_class
type_synonym sfc = semi_formal_content
(*>>*)
doc_class cenelec_term = semi_formal_content +
mcc :: math_content_class <= "terminology"
declare[[ Definition_default_class="semi_formal_content"]]
text\<open>In the following, all \<^theory_text>\<open>Definition*\<close> statements are interpreted as
\<^doc_class>\<open>cenelec_term\<close>s, i.e., semi-formal terminological definitions.\<close>
(*
ML\<open>
val typ_abbrev = Scan.lift
\<close>
ML\<open>
Parse.position: ( Token.T list -> 'a * Token.T list) -> ('a * Position.T) parser;
Scan.lift(Parse.position Args.name) ;
Args.name_position;
Proof_Context.read_typ_abbrev;
Args.typ_abbrev : Context.generic * Token.T list -> typ * (Context.generic * Token.T list) ;
Proof_Context.read_typ_abbrev;
(Args.typ_abbrev >> (fn Type(ss,_) => ss
| _ => error "Undefined Class Id"))
\<close>
ML\<open>
fun context_position_parser parse_con (ctxt, toks) =
let val pos = case toks of
a :: _ => Token.pos_of a
| _ => @{here} \<comment> \<open>a real hack !\<close>
val (res, (ctxt', toks')) = parse_con (ctxt, toks)
in ((res,pos),(ctxt', toks')) end
val parse_cid = (context_position_parser Args.typ_abbrev)
>> (fn (Type(ss,_),pos) => (pos,ss)
|( _,pos) => ISA_core.err "Undefined Class Id" pos);
val parse_cid' = Term_Style.parse -- parse_cid
fun pretty_cid_style ctxt (style,(pos,cid)) =
(* reconversion to term in order to have access to term print options like short_names etc. *)
Document_Output.pretty_term ctxt ((AttributeAccess.compute_cid_repr ctxt cid pos));
val _ = Theory.setup (AttributeAccess.basic_entity \<^binding>\<open>Onto_class\<close> parse_cid' pretty_cid_style)
\<close>
text\<open> \<^Onto_class>\<open>cenelec_term\<close> \<close>
*)
declare[[ Definition_default_class="cenelec_term"]]
(*>>*)
text\<open> Excerpt of the BS EN 50128:2011, page 22. \<close>
section\<open>Terms and Definitions\<close>
Definition*[assessment,short_name="''assessment''"]
@ -66,90 +121,90 @@ requirements and to form a judgement as to whether the software is fit for its
Safety assessment is focused on but not limited to the safety properties of a system.\<close>
Definition*[assessor, short_name="''assessor''"]
\<open>entity that carries out an assessment\<close>
\<open>entity that carries out an assessment.\<close>
Definition*[COTS, short_name="''commercial off-the-shelf software''"]
\<open>software defined by market-driven need, commercially available and whose fitness for purpose
has been demonstrated by a broad spectrum of commercial users\<close>
has been demonstrated by a broad spectrum of commercial users.\<close>
Definition*[component]
\<open>a constituent part of software which has well-defined interfaces and behaviour
with respect to the software architecture and design and fulfils the following
criteria:
\<^enum> it is designed according to “Components” (see Table A.20);
\<^enum> it covers a specific subset of software requirements;
\<^enum> it is clearly identified and has an independent version inside the
configuration management system or is a part of a collection of components
(e. g. subsystems) which have an independent version
\<^enum> it is designed according to “Components” (see Table A.20);
\<^enum> it covers a specific subset of software requirements;
\<^enum> it is clearly identified and has an independent version inside the
configuration management system or is a part of a collection of components
(e. g. subsystems) which have an independent version
\<close>
typ "sfc"
Definition*[CMGR, short_name="''configuration manager''"]
\<open>entity that is responsible for implementing and carrying out the processes
for the configuration management of documents, software and related tools including
\<^emph>\<open>change management\<close>\<close>
\<^emph>\<open>change management\<close>.\<close>
Definition*[customer]
\<open>entity which purchases a railway control and protection system including the software\<close>
\<open>entity which purchases a railway control and protection system including the software.\<close>
Definition*[designer]
\<open>entity that analyses and transforms specified requirements into acceptable design solutions
which have the required safety integrity level\<close>
which have the required safety integrity level.\<close>
Definition*[entity]
\<open>person, group or organisation who fulfils a role as defined in this European Standard\<close>
\<open>person, group or organisation who fulfils a role as defined in this European Standard.\<close>
declare_reference*[fault]
Definition*[error]
\<open>defect, mistake or inaccuracy which could result in failure or in a deviation
from the intended performance or behaviour (cf. @{semi_formal_content (unchecked) \<open>fault\<close>}))\<close>
from the intended performance or behaviour (cf. @{cenelec_term (unchecked) \<open>fault\<close>}).\<close>
Definition*[fault]
\<open>defect, mistake or inaccuracy which could result in failure or in a deviation
from the intended performance or behaviour (cf @{semi_formal_content \<open>error\<close>})\<close>
from the intended performance or behaviour (cf. @{cenelec_term \<open>error\<close>}).\<close>
Definition*[failure]
\<open>unacceptable difference between required and observed performance\<close>
\<open>unacceptable difference between required and observed performance.\<close>
Definition*[FT, short_name="''fault tolerance''"]
\<open>built-in capability of a system to provide continued correct provision of service as specified,
in the presence of a limited number of hardware or software faults\<close>
in the presence of a limited number of hardware or software faults.\<close>
Definition*[firmware]
\<open>software stored in read-only memory or in semi-permanent storage such as flash memory, in a
way that is functionally independent of applicative software\<close>
way that is functionally independent of applicative software.\<close>
Definition*[GS,short_name="''generic software''"]
\<open>software which can be used for a variety of installations purely by the provision of
application-specific data and/or algorithms\<close>
application-specific data and/or algorithms.\<close>
Definition*[implementer]
\<open>entity that transforms specified designs into their physical realisation\<close>
\<open>entity that transforms specified designs into their physical realisation.\<close>
Definition*[integration]
\<open>process of assembling software and/or hardware items, according to the architectural and
design specification, and testing the integrated unit\<close>
design specification, and testing the integrated unit.\<close>
Definition*[integrator]
\<open>entity that carries out software integration\<close>
\<open>entity that carries out software integration.\<close>
Definition*[PES :: sfc, short_name="''pre-existing software''"]
Definition*[PES, short_name="''pre-existing software''"]
\<open>software developed prior to the application currently in question, including COTS (commercial
off-the shelf) and open source software\<close>
off-the-shelf) and open source software.\<close>
Definition*[OSS :: sfc, short_name="''open source software''"]
\<open>source code available to the general public with relaxed or non-existent copyright restrictions\<close>
Definition*[OS, short_name="''open source software''"]
\<open>source code available to the general public with relaxed or non-existent copyright restrictions.\<close>
Definition*[PLC, short_name="''programmable logic controller''"]
\<open>solid-state control system which has a user programmable memory for storage of instructions to
implement specific functions\<close>
implement specific functions.\<close>
Definition*[PM, short_name="''project management''"]
\<open>administrative and/or technical conduct of a project, including safety aspects\<close>
\<open>administrative and/or technical conduct of a project, including safety aspects.\<close>
Definition*[PMGR, short_name="''project manager''"]
\<open>entity that carries out project management\<close>
\<open>entity that carries out \<^cenelec_term>\<open>PM\<close>.\<close>
Definition*[reliability]
\<open>ability of an item to perform a required function under given conditions for a given period of time\<close>
@ -157,52 +212,52 @@ Definition*[reliability]
Definition*[robustness]
\<open>ability of an item to detect and handle abnormal situations\<close>
Definition*[RMGR, short_name="''requirements manager''"]
\<open>entity that carries out requirements management\<close>
Definition*[RM, short_name="''requirements management''"]
\<open>the process of eliciting, documenting, analysing, prioritising and agreeing on requirements and
then controlling change and communicating to relevant stakeholders. It is a continuous process
throughout a project.\<close>
Definition*[RMGR, short_name="''requirements manager''"]
\<open>entity that carries out \<^cenelec_term>\<open>RM\<close>.\<close>
Definition*[risk]
\<open>combination of the rate of occurrence of accidents and incidents resulting in harm (caused by
a hazard) and the degree of severity of that harm\<close>
a hazard) and the degree of severity of that harm.\<close>
Definition*[safety]
\<open>freedom from unacceptable levels of risk of harm to people\<close>
\<open>freedom from unacceptable levels of risk of harm to people.\<close>
Definition*[SA, short_name="''safety authority''"]
\<open>body responsible for certifying that safety related software or services comply with relevant
statutory safety requirements\<close>
statutory safety requirements.\<close>
Definition*[SF, short_name="''safety function''"]
\<open>a function that implements a part or whole of a safety requirement\<close>
\<open>a function that implements a part or whole of a safety requirement.\<close>
Definition*[SFRS, short_name= "''safety-related software''"]
\<open>software which performs safety functions\<close>
\<open>software which performs safety functions.\<close>
Definition*[software]
\<open>intellectual creation comprising the programs, procedures, rules, data and any associated
documentation pertaining to the operation of a system\<close>
documentation pertaining to the operation of a system.\<close>
Definition*[SB, short_name="''software baseline''"]
\<open>complete and consistent set of source code, executable files, configuration files,
installation scripts and documentation that are needed for a software release. Information about
compilers, operating systems, preexisting software and dependent tools is stored as part of the
baseline. This will enable the organisation to reproduce defined versions and be the input
for future releases at enhancements or at upgrade in the maintenance phase\<close>
for future releases at enhancements or at upgrade in the maintenance phase.\<close>
Definition*[SD, short_name="''software deployment''"]
\<open>transferring, installing and activating a deliverable software baseline that has already been
released and assessed\<close>
released and assessed.\<close>
Definition*[SWLC, short_name="''software life-cycle''"]
\<open>those activities occurring during a period of time that starts when
software is conceived and ends when the software is no longer available for use. The software life
cycle typically includes a requirements phase, design phase,test phase, integration phase,
deployment phase and a maintenance phase\<close>
cycle typically includes a requirements phase, design phase, test phase, integration phase,
deployment phase and a maintenance phase.\<close>
Definition*[SWMA, short_name="''software maintainability''"]
\<open>capability of the software to be modified; to correct faults,
@ -210,11 +265,12 @@ improve performance or other attributes, or adapt it to a different environment\
Definition*[SM, short_name="''software maintenance''"]
\<open> action, or set of actions, carried out on software after deployment with the aim of enhancing or correcting
its functionality, performance or other attributes, or adapting it to a different environment\<close>
its functionality, performance or other attributes, or adapting it to a different environment.\<close>
Definition*[SOSIL, short_name="''software safety integrity level''"]
\<open>classification number which determines the techniques and measures that have to be applied to
software
software.
NOTE: Safety-related software has been classified into five safety integrity levels, where
0 is the lowest and 4 the highest.\<close>
@ -255,16 +311,16 @@ not obvious; a compiler that incorporates an executable run-time package into th
Definition*[traceability, short_name="''traceability''"]
\<open>degree to which relationship can be established between two or more products of a development
process, especially those having a predecessor/successor or master/subordinate relationship to one
another\<close>
another.\<close>
Definition*[validation, short_name="''validation''"]
\<open>process of analysis followed by a judgment based on evidence to determine whether an item
(e.g. process, documentation, software or application) fits the user needs, in particular with respect to safety
and quality and with emphasis on the suitability of its
operation in accordance to its purpose in its intended environment\<close>
operation in accordance to its purpose in its intended environment.\<close>
Definition*[validator, short_name="''validator''"]
\<open>entity that is responsible for the validation\<close>
\<open>entity that is responsible for the validation.\<close>
Definition*[verification, short_name="''verification''"]
\<open>process of examination followed by a judgment based on evidence that output items (process,
@ -275,13 +331,13 @@ NOTE: Verification is mostly based on document reviews
\<close>
Definition*[verifier, short_name="''verifier''"]
\<open>entity that is responsible for one or more verification activities\<close>
\<open>entity that is responsible for one or more verification activities.\<close>
chapter\<open>Software Management and Organisation\<close>
text\<open>Representing chapter 5 in @{cite "bsi:50128:2014"}.\<close>
section\<open>Organization, Roles and Responsibilities\<close>
text\<open>See also section \<^emph>\<open>Software management and organization\<close>. and Annex B and C\<close>
text\<open>See also section \<^emph>\<open>Software management and organization\<close> and Annex B and C.\<close>
text\<open>REQ role in Table C.1 is assumed to be a typo and should be RQM.\<close>
@ -300,17 +356,17 @@ datatype role = RQM \<comment> \<open>Requirements Manager\<close>
datatype phase = SYSDEV_ext \<comment> \<open> System Development Phase (external)\<close>
| SPl \<comment> \<open>Software Planning\<close>
| SR \<comment> \<open>Software Requirements\<close>
| SADES \<comment> \<open>Software Architecture and Design\<close>
| SCDES \<comment> \<open>Software Component Design\<close>
| CInT \<comment> \<open>Component Implementation and Testing\<close>
| SI \<comment> \<open>Software Integration\<close>
| SV \<comment> \<open>Overall Software Testing/Final Validation\<close>
| SCADA \<comment> \<open>Systems Configured by Application Data/Algorithms\<close>
| SD \<comment> \<open>Software Deployment\<close>
| SM \<comment> \<open>Software Maintenance\<close>
| SA \<comment> \<open>Software Assessment\<close>
| SPl \<comment> \<open>Software Planning\<close>
| SR \<comment> \<open>Software Requirements\<close>
| SADES \<comment> \<open>Software Architecture and Design\<close>
| SCDES \<comment> \<open>Software Component Design\<close>
| CInT \<comment> \<open>Component Implementation and Testing\<close>
| SI \<comment> \<open>Software Integration\<close>
| SV \<comment> \<open>Overall Software Testing/Final Validation\<close>
| SCADA \<comment> \<open>Systems Configured by Application Data/Algorithms\<close>
| SD \<comment> \<open>Software Deployment\<close>
| SM \<comment> \<open>Software Maintenance\<close>
| SA \<comment> \<open>Software Assessment\<close>
abbreviation software_planning :: "phase" where "software_planning \<equiv> SPl"
abbreviation software_requirements :: "phase" where "software_requirements \<equiv> SR"
@ -340,7 +396,7 @@ text\<open>Requirement levels specified Annex A: we use the term \<^emph>\<open>
them from the requirements specified in the standard.\<close>
datatype normative_level =
M \<comment> \<open>(Mandatory)\<close>
M \<comment> \<open>(Mandatory)\<close>
| HR \<comment> \<open>Highly Recommended\<close>
| R \<comment> \<open>Recommended\<close>
| Any \<comment> \<open>No recommendation\<close>
@ -365,14 +421,15 @@ doc_class safety_requirement = requirement +
formal_definition :: "thm list"
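text\<open>A sketch (hypothetical label and theorem name) of how a \<^doc_class>\<open>safety_requirement\<close>
could be instantiated, linking the informal requirement to a formal Isabelle theorem via the
term-level \<open>@{thm ...}\<close> antiquotation:\<close>
(*
text*[req_pos::safety_requirement, formal_definition="[@{thm ''pos_correct''}]"]
\<open>The computed position shall deviate from the physical position by at most one sampling interval.\<close>
*)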
text\<open>The category @{emph \<open>hypothesis\<close>} is used for assumptions from the
foundational mathematical or physical domain, so for example:
\<^item> the Mordell-Lang conjecture holds,
\<^item> euklidian geometry is assumed, or
\<^item> Newtonian (non-relativistic) physics is assumed,
Their acceptance is inherently outside the scope of the model
and can only established inside certification process by
authority argument.
text\<open>
The category \<^emph>\<open>hypothesis\<close> is used for assumptions from the
foundational mathematical or physical domain, so for example:
\<^item> the Mordell-Lang conjecture holds,
  \<^item> Euclidean geometry is assumed, or
  \<^item> Newtonian (non-relativistic) physics is assumed.
  Their acceptance is inherently outside the scope of the model
  and can only be established within the certification process by
  an authority argument.
\<close>
datatype hyp_type = physical | mathematical | computational | other
@ -384,7 +441,7 @@ doc_class hypothesis = requirement +
hyp_type :: hyp_type <= physical (* default *)
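text\<open>For illustration (hypothetical label, not part of the ontology), a physical hypothesis
could be stated as follows:\<close>
(*
text*[newtonian_physics::hypothesis, hyp_type="physical"]
\<open>The kinematics of the train is adequately described by Newtonian (non-relativistic) mechanics.\<close>
*)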
text\<open> The following sub-class of security related hypothesis might comprise, for example:
\<^item> @{term "P \<noteq> NP"},
\<^item> \<^term>\<open>P \<noteq> NP\<close>,
\<^item> or the computational hardness of certain functions
relevant for cryptography (like prime-factorization).
(* speculation bu, not 50128 *)\<close>
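text\<open>A minimal sketch, with an illustrative class name only, of what such a sub-class could
look like (overriding the default of the inherited \<open>hyp_type\<close> attribute):\<close>
(*
doc_class security_hypothesis = hypothesis +
   hyp_type :: hyp_type <= computational
*)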
@ -469,7 +526,7 @@ figure*[fig3::figure, relative_width="100",
\<open>Illustrative Development Lifecycle 1\<close>
text\<open>Actually, Figure 4 in Chapter 5: Illustrative Development Lifecycle 2 is more faithful
to the remaining document: Software Architecture and Dediwn phases are merged, like in 7.3.\<close>
to the remaining document: Software Architecture and Design phases are merged, like in 7.3.\<close>
section\<open>Software Assurance related Entities and Concepts\<close>
@ -541,9 +598,10 @@ doc_class judgement =
status :: status
is_concerned :: "role set" <= "{VER,ASR,VAL}"
section\<open> Design and Test Documents \<close>
section\<open> A Classification of CENELEC Reports and Documents \<close>
doc_class cenelec_document = text_element +
(* should we rename this as "report" ??? bu *)
doc_class cenelec_report = text_element +
phase :: "phase"
sil :: "sil"
level :: "int option" <= "Some(0)"
@ -583,61 +641,61 @@ doc_class SYS_VnV_pl = external_specification +
phase :: "phase" <= "SYSDEV_ext"
type_synonym system_VnV_plan = SYS_VnV_pl
doc_class SQAP = cenelec_document +
doc_class SQAP = cenelec_report +
phase :: "phase" <= "SPl"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_sqap :: "nlvl \<sigma> = HR"
type_synonym software_quality_assurance_plan = SQAP
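text\<open>A sketch of the effect of such invariants (hypothetical instance label): when invariant
checking is enabled, an \<^doc_class>\<open>SQAP\<close> instance whose normative level is not \<open>HR\<close> would
be rejected.\<close>
(*
text*[sqap_odo::SQAP, nlvl="HR"]
\<open>Software quality assurance plan for the odometric software.\<close>
*)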
doc_class SQAVR = cenelec_document +
doc_class SQAVR = cenelec_report +
phase :: "phase" <= "SPl"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_sqavr :: "nlvl \<sigma> = HR"
type_synonym software_quality_assurance_verification_report = SQAVR
doc_class SCMP = cenelec_document +
doc_class SCMP = cenelec_report +
phase :: "phase" <= "SPl"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_scmp :: "nlvl \<sigma> = HR"
type_synonym software_configuration_management_plan = SCMP
doc_class SVP = cenelec_document +
doc_class SVP = cenelec_report +
phase :: "phase" <= "SPl"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_svp :: "nlvl \<sigma> = HR"
type_synonym software_verification_plan = SVP
doc_class SVAP = cenelec_document +
doc_class SVAP = cenelec_report +
phase :: "phase" <= "SPl"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_svap :: "nlvl \<sigma> = HR"
type_synonym software_validation_plan = SVAP
doc_class SWRS = cenelec_document +
doc_class SWRS = cenelec_report +
phase :: "phase" <= "SR"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swrs :: "nlvl \<sigma> = HR"
type_synonym software_requirements_specification = SWRS
doc_class OSWTS = cenelec_document +
doc_class OSWTS = cenelec_report +
phase :: "phase" <= "SR"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_oswts :: "nlvl \<sigma> = HR"
type_synonym overall_software_test_specification = OSWTS
doc_class SWRVR = cenelec_document +
doc_class SWRVR = cenelec_report +
phase :: "phase" <= "SR"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swrvr :: "nlvl \<sigma> = HR"
type_synonym software_requirements_verification_report = SWRVR
doc_class SWAS = cenelec_document +
doc_class SWAS = cenelec_report +
phase :: "phase" <= "SADES"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swas :: "nlvl \<sigma> = HR"
type_synonym software_architecture_specification = SWAS
doc_class SWDS = cenelec_document +
doc_class SWDS = cenelec_report +
phase :: "phase" <= "SADES"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swds :: "nlvl \<sigma> = HR"
@ -654,7 +712,7 @@ doc_class SWIS_E =
type_synonym software_interface_specification_element = SWIS_E
doc_class SWIS = cenelec_document +
doc_class SWIS = cenelec_report +
phase :: "phase" <= "SADES"
nlvl :: "normative_level" <= "HR"
written_by :: "role option" <= "Some DES"
@ -664,198 +722,198 @@ doc_class SWIS = cenelec_document +
invariant force_nlvl_swis :: "nlvl \<sigma> = HR"
type_synonym software_interface_specifications = SWIS
doc_class SWITS = cenelec_document +
doc_class SWITS = cenelec_report +
phase :: "phase" <= "SADES"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swits :: "nlvl \<sigma> = HR"
type_synonym software_integration_test_specification = SWITS
doc_class SWHITS = cenelec_document +
doc_class SWHITS = cenelec_report +
phase :: "phase" <= "SADES"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swhits :: "nlvl \<sigma> = HR"
type_synonym software_hardware_integration_test_specification = SWHITS
doc_class SWADVR = cenelec_document +
doc_class SWADVR = cenelec_report +
phase :: "phase" <= "SADES"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swadvr :: "nlvl \<sigma> = HR"
type_synonym software_architecture_and_design_verification_report = SWADVR
type_synonym software_architecture_and_design_verification = SWADVR
doc_class SWCDS = cenelec_document +
doc_class SWCDS = cenelec_report +
phase :: "phase" <= "SCDES"
invariant force_nlvl_swcds :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_component_design_specification = SWCDS
doc_class SWCTS = cenelec_document +
doc_class SWCTS = cenelec_report +
phase :: "phase" <= "SCDES"
invariant force_nlvl_swcts :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_component_test_specification = SWCTS
doc_class SWCDVR = cenelec_document +
doc_class SWCDVR = cenelec_report +
phase :: "phase" <= "SCDES"
invariant force_nlvl_swcdvr :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_component_design_verification_report = SWCDVR
type_synonym software_component_design_verification = SWCDVR
doc_class SWSCD = cenelec_document +
doc_class SWSCD = cenelec_report +
phase :: "phase" <= "CInT"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swscd :: "nlvl \<sigma> = HR"
type_synonym software_source_code_and_documentation = SWSCD
doc_class SWCTR = cenelec_document +
doc_class SWCTR = cenelec_report +
phase :: "phase" <= "CInT"
invariant force_nlvl_swctr :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_component_test_report = SWCTR
doc_class SWSCVR = cenelec_document +
doc_class SWSCVR = cenelec_report +
phase :: "phase" <= "CInT"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swscvr :: "nlvl \<sigma> = HR"
type_synonym software_source_code_verification_report = SWSCVR
doc_class SWITR = cenelec_document +
doc_class SWITR = cenelec_report +
phase :: "phase" <= "SI"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_switr :: "nlvl \<sigma> = HR"
type_synonym software_integration_test_report = SWITR
doc_class SWHAITR = cenelec_document +
doc_class SWHAITR = cenelec_report +
phase :: "phase" <= "SI"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swhaitr :: "nlvl \<sigma> = HR"
type_synonym software_hardware_integration_test_report = SWHAITR
doc_class SWIVR = cenelec_document +
doc_class SWIVR = cenelec_report +
phase :: "phase" <= "SI"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swivr :: "nlvl \<sigma> = HR"
type_synonym software_integration_verification_report = SWIVR
doc_class OSWTR = cenelec_document +
doc_class OSWTR = cenelec_report +
phase :: "phase" <= "SV"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_oswtr :: "nlvl \<sigma> = HR"
type_synonym overall_software_test_report = OSWTR
doc_class SWVALR = cenelec_document +
doc_class SWVALR = cenelec_report +
phase :: "phase" <= "SV"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swvalr :: "nlvl \<sigma> = HR"
type_synonym software_validation_report = SWVALR
doc_class TVALR = cenelec_document +
doc_class TVALR = cenelec_report +
phase :: "phase" <= "SV"
invariant force_nlvl_tvalr :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym tools_validation_report = TVALR
doc_class SWVRN = cenelec_document +
doc_class SWVRN = cenelec_report +
phase :: "phase" <= "SV"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swvrn :: "nlvl \<sigma> = HR"
type_synonym software_validation_release_note = SWVRN
doc_class ARS = cenelec_document +
doc_class ARS = cenelec_report +
phase :: "phase" <= "SCADA"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_ars :: "nlvl \<sigma> = HR"
type_synonym application_requirements_specification = ARS
doc_class APP = cenelec_document +
doc_class APP = cenelec_report +
phase :: "phase" <= "SCADA"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_APP :: "nlvl \<sigma> = HR"
type_synonym application_preparation_plan = APP
doc_class ATS = cenelec_document +
doc_class ATS = cenelec_report +
phase :: "phase" <= "SCADA"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_ats :: "nlvl \<sigma> = HR"
type_synonym application_test_specification = ATS
doc_class AAD = cenelec_document +
doc_class AAD = cenelec_report +
phase :: "phase" <= "SCADA"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_aad :: "nlvl \<sigma> = HR"
type_synonym application_architecture_design = AAD
doc_class APVR = cenelec_document +
doc_class APVR = cenelec_report +
phase :: "phase" <= "SCADA"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_apvr :: "nlvl \<sigma> = HR"
type_synonym application_preparation_verification_report = APVR
doc_class ATR = cenelec_document +
doc_class ATR = cenelec_report +
phase :: "phase" <= "SCADA"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_atr :: "nlvl \<sigma> = HR"
type_synonym application_test_report = ATR
doc_class SOCOADA = cenelec_document +
doc_class SOCOADA = cenelec_report +
phase :: "phase" <= "SCADA"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_socoada :: "nlvl \<sigma> = HR"
type_synonym source_code_application_data_algorithms = SOCOADA
doc_class ADAVR = cenelec_document +
doc_class ADAVR = cenelec_report +
phase :: "phase" <= "SCADA"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_adavr :: "nlvl \<sigma> = HR"
type_synonym application_data_algorithms_verification_report = ADAVR
doc_class SWRDP = cenelec_document +
doc_class SWRDP = cenelec_report +
phase :: "phase" <= "SD"
invariant force_nlvl_swrdp :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_release_deployment_plan = SWRDP
doc_class SWDM = cenelec_document +
doc_class SWDM = cenelec_report +
phase :: "phase" <= "SD"
invariant force_nlvl_swdm :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_deployment_manual = SWDM
doc_class SWDRN = cenelec_document +
doc_class SWDRN = cenelec_report +
phase :: "phase" <= "SD"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swdrn :: "nlvl \<sigma> = HR"
type_synonym software_deployment_release_notes = SWDRN
doc_class SWDR = cenelec_document +
doc_class SWDR = cenelec_report +
phase :: "phase" <= "SD"
invariant force_nlvl_swdr :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_deployment_records = SWDR
doc_class SWDVR = cenelec_document +
doc_class SWDVR = cenelec_report +
phase :: "phase" <= "SD"
invariant force_nlvl_swdvr :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_deployment_verification_report = SWDVR
doc_class SWMP = cenelec_document +
doc_class SWMP = cenelec_report +
phase :: "phase" <= "SM"
invariant force_nlvl_swmp :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_maintenance_plan = SWMP
doc_class SWCR = cenelec_document +
doc_class SWCR = cenelec_report +
phase :: "phase" <= "SM"
nlvl :: "normative_level" <= "HR"
invariant force_nlvl_swcr :: "nlvl \<sigma> = HR"
type_synonym software_change_records = SWCR
doc_class SWMR = cenelec_document +
doc_class SWMR = cenelec_report +
phase :: "phase" <= "SM"
invariant force_nlvl_swmr :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_maintenance_records = SWMR
doc_class SWMVR = cenelec_document +
doc_class SWMVR = cenelec_report +
phase :: "phase" <= "SM"
invariant force_nlvl_swmvr :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_maintenance_verification_report = SWMVR
doc_class SWAP = cenelec_document +
doc_class SWAP = cenelec_report +
phase :: "phase" <= "SA"
invariant force_nlvl_swap :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_assessment_plan = SWAP
doc_class SWAR = cenelec_document +
doc_class SWAR = cenelec_report +
phase :: "phase" <= "SA"
invariant force_nlvl_swar :: "if sil \<sigma> = SIL0 then nlvl \<sigma> = R else nlvl \<sigma> = HR"
type_synonym software_assessment_report = SWAR
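text\<open>The \<open>sil\<close>-dependent invariants above reflect the standard's recommendation tables: for
\<open>SIL0\<close> such a report is only recommended, otherwise it is highly recommended. A sketch
(hypothetical instance label; only the constructor \<open>SIL0\<close> is assumed here):\<close>
(*
text*[swar_sil0::SWAR, sil="SIL0", nlvl="R"]
\<open>Assessment report for a configuration without safety-related functions.\<close>
*)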

View File

@ -38,16 +38,17 @@ doc_class "code" = technical +
checked :: bool <= "False"
caption :: "string" <= "''''"
typ code
text\<open>The @{doc_class "code"} is a general stub for free-form and type-checked code-fragments
such as:
\<^enum> SML code
\<^enum> bash code
\<^enum> isar code (although this might be an unwanted concurrence to the Isabelle standard cartouche)
\<^enum> C code.
text\<open>The \<^doc_class>\<open>code\<close> is a general stub for free-form and type-checked code-fragments such as:
\<^enum> SML code
\<^enum> bash code
\<^enum> Isar code (although this might be an unwanted competitor
  to the Isabelle standard cartouche)
\<^enum> C code.
it is intended that later refinements of this "stub" as done in \<^verbatim>\<open>Isabelle_C\<close> which come with their
own content checking and, of course, presentation styles.
It is intended that later refinements of this "stub", as done in \<^verbatim>\<open>Isabelle_C\<close>, come with their
own content checking and presentation styles.
\<close>
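text\<open>A usage sketch (hypothetical label) for such a code stub, here for the \<open>SML\<close> refinement
declared below:\<close>
(*
text*[sml_sample::SML, caption="''incrementing a list of measurements''"]
\<open>\<^verbatim>\<open>map (fn x => x + 1) [1, 2, 3]\<close>\<close>
*)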
doc_class "SML" = code +