Software Safety
Abstract
This is an age of rapidly increasing technological dependence on computer-controlled systems. Given the relative and obvious unreliability of the humble but ubiquitous PC, the reader may wonder just how safe this trend might be. In this article, the author tries to answer this question.
The safety of any system is reflected in its inability to do harm. A system is unsafe if harm can result from its operation and we then call this a safety‐critical system. If a system is safe only within certain operating bounds, then these must be explored and defined. Assessing the safety of a system can be a very complex process involving many factors including but not limited to reliability, design resilience, human interaction, and interactions with other systems. As systems evolve into ever more complex forms, this just gets harder.
From this we can see that safety is a property of a system as a whole. In this article, only computer-controlled systems are addressed. By definition, any computer-controlled system contains software of one kind or another, so computer-controlled systems and software-controlled systems are treated as synonymous. The software contained therein might be embedded in the system in the form of firmware or may be readily accessible; from the point of view of safety, however, there is no effective distinction. Software-controlled systems are essentially treated in the same way as conventional engineering systems such as bridges, but the scale of software-controlled systems is now much larger and many of the associated risks turn out to be essentially unquantifiable at the current state of knowledge.
Computer-controlled systems can take many forms. In some systems, software may contribute only a small part to the overall operation, and the failure of the software component may not prejudice the overall behavior at all. In such systems, we say that software is a noncritical component.
In contrast, for other systems, software may contribute a major part or even the entire behavior of the system. Failure of the software in such a system prejudices the overall behavior of the system, perhaps catastrophically. In these systems, we say that software is a critical component.
It is becoming more and more difficult to assess the safety or reliability of software-controlled systems. There are essentially five contributing factors: size, complexity, coupling, ubiquity, and sensitivity. The influence of design and implementation, and of standardization initiatives, is also discussed.