Critical System
Definition of Critical Systems
A critical system is a system whose failure can result in large economic losses, physical damage, or threats to human life.
3 Main Types of Critical Systems
1. Safety-critical systems. A failure may result in injury, death, or environmental damage.
2. Mission-critical systems. A failure may result in the failure of a goal-directed activity.
3. Business-critical systems. A failure may result in the failure of the businesses that use the system.
Critical System Dependability
Dependability is a property of the system equivalent to its trustworthiness: the degree of user confidence that the system will operate as expected and will not fail in normal use. Dependability dimensions:
Availability: the probability that the system is up and able to deliver useful services at any given time.
Reliability: the probability that, over a given period of time, the system delivers services correctly, as the user expects.
Safety: a judgment of how likely it is that the system will not cause damage to people or to its environment.
Security: a judgment of how likely it is that the system can resist deliberate or accidental intrusions.
For example, in an insulin delivery system, three dependability dimensions apply:
Availability: A system must be in place to deliver insulin when needed
Reliability: The system must work reliably and deliver the right amount of insulin.
Safety: System failure can result in excessive dosing which is life threatening to the patient.
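The availability dimension above is commonly quantified from mean time between failures (MTBF) and mean time to repair (MTTR). A minimal sketch; the figures are illustrative, not taken from the source:

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability: the fraction of time the system is operational."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Illustrative figures: a device that fails once every 5000 hours
# and takes 2 hours to repair.
print(round(availability(5000, 2), 4))  # → 0.9996
```

A system like an insulin pump would need this value to be very close to 1, since a long outage is itself a safety problem.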
Security
Security is an assessment of the extent to which the system protects itself from external attacks, whether deliberate or accidental. Stages of the security specification process:
Identification and evaluation of assets (data and programs).
Threat analysis and risk assessment.
Threat classification.
Technology analysis.
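The threat analysis and risk assessment stage is often sketched as scoring each threat by likelihood times impact. A hypothetical illustration; the threat names and scores below are invented, not from the source:

```python
# Hypothetical threat register: (threat, likelihood 1-5, impact 1-5)
threats = [
    ("unauthorised database read", 3, 5),
    ("accidental config change",   4, 2),
    ("denial of service",          2, 4),
]

# Rank threats by a simple risk score = likelihood * impact.
ranked = sorted(threats, key=lambda t: t[1] * t[2], reverse=True)
for name, likelihood, impact in ranked:
    print(f"{name}: risk={likelihood * impact}")
```

The highest-scoring threats would then drive the threat classification and technology analysis stages.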
Critical System Specifications
Because the potential cost of system failure is high, it is important to ensure that critical system specifications are of high quality and accurately reflect the actual needs of system users.
Fault Avoidance and Fault Tolerance
Fault avoidance:
Minimize the introduction of faults. Fault-free software conforms exactly to its specification.
However, software that is free of faults is not necessarily free of failures.
Information hiding: program components should be given access only to the data they need to carry out their function.
Information hiding ensures that hidden data cannot be tampered with by program components that are not supposed to use it.
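Information hiding as described above can be sketched in Python by exposing only the operations a component needs while keeping the underlying data internal. The class and method names are illustrative:

```python
class DoseCounter:
    """Hides its internal count; other components use only the public methods."""

    def __init__(self) -> None:
        self._count = 0  # leading underscore: internal state, not for outside use

    def record_dose(self) -> None:
        self._count += 1

    def doses_given(self) -> int:
        return self._count

counter = DoseCounter()
counter.record_dose()
counter.record_dose()
print(counter.doses_given())  # → 2
```

Because other components never touch `_count` directly, a fault elsewhere in the program cannot silently corrupt it.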
Fault tolerance:
The goal is to ensure that faults in the system do not lead to system failure. It is necessary where system failure could cause a serious accident,
or where loss of system operation would cause large economic losses. Being fault-free does not mean being failure-free.
Fault tolerance aspects:
Fault detection.
Damage assessment.
Fault recovery.
Fault repair.
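The four aspects above can be sketched around the document's insulin example. The dose limit and function name are assumptions for illustration, not real device logic:

```python
MAX_SAFE_DOSE = 10.0  # hypothetical upper limit, in insulin units

def deliver_dose(requested: float) -> float:
    # Fault detection: spot an out-of-range computed dose before acting on it.
    if requested < 0 or requested > MAX_SAFE_DOSE:
        # Damage assessment: an unsafe dose was computed but not yet delivered,
        # so no harm has occurred.
        # Fault recovery: fall back to a safe state (deliver nothing).
        return 0.0
    return requested
    # Fault repair (fixing the code that produced the bad value)
    # happens offline after diagnosis, not in this execution path.

print(deliver_dose(4.0))   # → 4.0
print(deliver_dose(50.0))  # → 0.0
```

The point of the sketch is that the fault (a wrongly computed dose) is tolerated: it is detected and contained instead of becoming a failure that harms the patient.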
Critical System Validation
The V&V process must demonstrate that the system meets its specifications and that the system's services and behavior support client requirements.
Analysis and testing beyond the normal level are therefore necessary, because:
The cost of failure is much greater than for non-critical systems.
Validating the dependability attributes gives users assurance.
Over 50% of total development cost for critical software systems may go into validation, to avoid costly system failures.
The quality of the system is influenced by the quality of the process used to develop it.
Example: the software failure on the maiden flight of the Ariane 5 rocket in 1996, which resulted in the loss of the rocket and its satellite payload.
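The Ariane 5 failure is widely reported to have stemmed from converting a 64-bit floating-point value to a 16-bit signed integer without a range check, causing an overflow. A simplified sketch of the defect and a defensive alternative; this is not the actual flight code:

```python
INT16_MIN, INT16_MAX = -32768, 32767

def to_int16_unchecked(value: float) -> int:
    """Mimics a narrowing conversion that silently wraps (the defect)."""
    return ((int(value) - INT16_MIN) % 65536) + INT16_MIN

def to_int16_checked(value: float) -> int:
    """Defensive version: detect the out-of-range value instead of wrapping."""
    if not INT16_MIN <= value <= INT16_MAX:
        raise OverflowError(f"value {value} does not fit in 16 bits")
    return int(value)

# A value outside the 16-bit range silently wraps to a nonsense number.
print(to_int16_unchecked(40000.0))  # → -25536
```

A checked conversion turns a silent fault into a detectable error, which the fault-tolerance machinery described earlier can then recover from.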