13

What industry-recognized software safety standards has anyone had experience adhering to while developing software that controls a device or system with the potential to harm the people using it?

user68472
  • 139
  • 3
  • any malfunction (error state) must stop the device and require a human action to turn it on again. – dusoft Feb 19 '09 at 16:00
  • Possibly an exact duplicate of http://stackoverflow.com/questions/142722/coding-for-high-reliability-availability-security-what-standards-do-i-read - but that one never satisfied me, so hopefully this one will come up with better results... – Adam Davis Feb 19 '09 at 16:10

6 Answers

5

The Aonix link above is a good one for the basic reasoning. If you want examples of how particular standards work, you can google for the following:

  • IEC61508: “Functional safety of electrical / electronic / programmable electronic safety-related systems (E/E/PES)”. The base standard referenced by various sector-specific standards; the IEC's page on it is a good starting point. Part 61508-3 is about software.
  • DO-178B: Avionics standard with a similar scope to IEC61508 that takes a slightly different view of software.
  • IEC60601-1-4: Deals with "programmable electrical medical systems" (part of the 60601 series of standards).
  • EN5012x: Railway-specific standards; EN 50128 is about software.
2

Different industries have different standards: aircraft and robotics, for example. Standards are still evolving in some newer industries, such as surgical robots. Still, there are some common elements. The first is redundancy. I work with industrial robots. For critical items such as speed control, we use three different calculations on two different controllers. For emergency stop systems we use dual circuits: every red e-stop button has two physical switches behind it.
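
As an illustration of the dual-circuit idea, here is a minimal sketch in C; the names, types, and latching behaviour are my own invention, not taken from any particular controller. Each button feeds two independent channels, the channels must agree, and any disagreement latches a fault that only a deliberate human reset can clear.

```c
#include <stdbool.h>

/* Hypothetical dual-channel e-stop evaluation. Each red button drives two
 * independent switches (channels A and B). In normal operation the channels
 * always agree; a disagreement indicates a wiring or contact fault, which
 * must also force the safe state and latch until reset. */

typedef enum { SYS_RUN, SYS_SAFE_STOP, SYS_FAULT_LATCHED } sys_state_t;

static bool fault_latched = false;

sys_state_t evaluate_estop(bool chan_a_pressed, bool chan_b_pressed)
{
    if (fault_latched) {
        return SYS_FAULT_LATCHED;   /* stays stopped until a human resets it */
    }
    if (chan_a_pressed != chan_b_pressed) {
        fault_latched = true;       /* channels disagree: treat as a fault */
        return SYS_FAULT_LATCHED;
    }
    return chan_a_pressed ? SYS_SAFE_STOP : SYS_RUN;
}

void estop_fault_reset(void)        /* deliberate, human-initiated action */
{
    fault_latched = false;
}
```

Real systems typically also tolerate a short discrepancy window between the channels, since two mechanical contacts never open at exactly the same instant.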

I used to work on aircraft. On aircraft autopilots there are often two or even three separate computers doing the calculations and comparing results.

The goal is to prevent any single failure from making the system unsafe.
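
To make the comparison idea concrete, here is a sketch of a two-out-of-three vote over redundant speed calculations. The function names and the tolerance value are invented for illustration; real systems define these per application.

```c
#include <math.h>
#include <stdbool.h>

/* Hypothetical 2-out-of-3 vote over redundant speed calculations.
 * Each value comes from an independent computation, ideally running
 * on separate hardware. Two channels that agree within a tolerance
 * outvote a third that has drifted or failed. */

#define SPEED_TOLERANCE 0.5   /* agreement window, in the sensor's units */

static bool agrees(double a, double b)
{
    return fabs(a - b) <= SPEED_TOLERANCE;
}

/* Writes the voted speed and returns true if at least two channels agree;
 * returns false if no majority exists, in which case the caller must
 * command the safe state rather than trust any single channel. */
bool vote_speed(double s1, double s2, double s3, double *voted)
{
    if (agrees(s1, s2)) { *voted = (s1 + s2) / 2.0; return true; }
    if (agrees(s1, s3)) { *voted = (s1 + s3) / 2.0; return true; }
    if (agrees(s2, s3)) { *voted = (s2 + s3) / 2.0; return true; }
    return false;
}
```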

You need to look at the rules and regulations that govern the area you are working in to see what the legal requirements are, but beyond that you have to make the entire system safe.

Jim C
  • 4,981
  • 21
  • 25
1

Approaches vary by field. (Sorry, my reputation only allows me to post one link...)

Avionics: DO-178B (see Wikipedia).

Information security: the Common Criteria (see the Common Criteria Portal website).

Medical devices: the FDA controls software for medical devices (think X-ray machines); see http://www.fda.gov/medicaldevices/deviceregulationandguidance/guidancedocuments/default.htm

The safety-critical mailing list at the University of York (UK) is an excellent resource. There has been recent discussion there of how to apply the SIL standards (see Wikipedia on Safety Integrity Levels) to software systems.

1

MISRA is the standard followed in the automotive industry, but it is a coding standard: it aims to ensure correctness of operation and portability.
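
To give a flavour of what that means in practice, here is a small C function written in the style those guidelines push you toward (this paraphrases the kind of rules involved; it is not the official MISRA text): fixed-width types, validated inputs, explicit casts, and a single point of exit.

```c
#include <stdint.h>

/* Illustrative of the coding style MISRA C encourages (paraphrased). */
typedef enum { STATUS_OK = 0, STATUS_RANGE_ERROR = 1 } status_t;

status_t scale_reading(uint16_t raw, uint16_t *scaled)
{
    status_t status = STATUS_OK;

    if (raw > 1000U) {                   /* validate inputs before use   */
        status = STATUS_RANGE_ERROR;
    } else {
        *scaled = (uint16_t)(raw * 4U);  /* explicit, range-checked cast */
    }

    return status;                       /* single point of exit         */
}
```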

You need to read up on the Therac-25 accidents to understand the complexity of this issue.

Also, NASA and military documents are widely available and discuss coding standards that allow the safety of a system to be assessed.

-Adam

Adam Davis
  • 91,931
  • 60
  • 264
  • 330
0

Not a safety standard, but reading comp.risks for a while will show you the kinds of problems you will need to work hard to prevent.

(The book, "Computer Related Risks" is very good as well.)

Richard
  • 106,783
  • 21
  • 203
  • 265
-8

Software should never run a system that could injure someone. The only possible (and still questionable) exception is a safety-rated PLC, which is specially engineered for safety.

You should look into:

GEOCHET
  • 21,119
  • 15
  • 74
  • 98
  • 3
    Your car's traction control and ABS could both kill people, and they are being run by software. Are you saying these systems should not have been created? – Adam Davis Feb 19 '09 at 16:08
  • 1
    @Adam: Both of these systems have fail safe mechanical and electrical controls. The safety portion of them is /not/ in code. – GEOCHET Feb 19 '09 at 16:10
  • 2
    "Both of these systems have fail safe mechanical and electrical controls." False. Having signed an NDA I'm not able to provide you with references. I would be interested to see what other automotive suppliers of these components do that makes you feel they have mechanical or electrical interlocks. – Adam Davis Feb 19 '09 at 16:23
  • 1
    Or, I should say, have mechanical or electrical interlocks that prevent the possibility of the software causing a life threatening problem. They do, of course, have many protections in all realms (code, mechanical, electrical) but the software is still critically capable of harm. – Adam Davis Feb 19 '09 at 16:24
  • 2
    Let's not forget medical device software, which is perfectly capable of killing (and has). How about robot control software (again, robots have killed)? If we shunned all software-controlled potentially lethal things, we'd lose big-time. – David Thornley Feb 19 '09 at 16:30
  • @Adam: If your ABS or Traction control system failing makes the brakes not work or causes a crash, then you have deep issues. – GEOCHET Feb 19 '09 at 16:30
  • @David: All of those systems should (and typically do) have electrical or mechanical safety mechanisms. E-Stop buttons, gates, trip wire switches, light curtains, sanity checks, etc. – GEOCHET Feb 19 '09 at 16:31
  • 1
    Rich: Please comment on http://www.nasa.gov/centers/dryden/news/FactSheets/FS-024-DFRC.html . Are you saying this is out and out wrong? What about Apollo? While the humans could override the computer, it was in charge of the rockets. – Adam Davis Feb 19 '09 at 16:43
  • 1
    "Software should never run a system that could injure someone" is shortsighted at best. Like it or not, we're coding things that were mechanized in the past, and in some cases this increases safety. The OP is valid - what should we be be doing to make sure our code is safe? – Adam Davis Feb 19 '09 at 16:45
  • @Adam: And my point is that the safety of a human should never be left solely to a computer or its software. Plain and simple. You explain to me how you would write code that meets SIL4 and Category 3 compliance and we can have a discussion. – GEOCHET Feb 19 '09 at 17:06
  • 2
    If you're going to say that there should always be a manual override or stop switch, that's at least a bit more reasonable. They don't always work, or aren't always used, of course, which is why (say) software-controlled radiation treatment machines have killed. – David Thornley Feb 20 '09 at 19:50
  • Software itself should be made more reliable. Proper selection of programming paradigm goes a long way. A declarative or logic language (Prolog, for example) would make it trivial to guarantee that a robot does not kill a human. The rules would be something like "moving the knife up will cut a human", "cutting a human will kill a human", and "we do not kill humans"; therefore the decision is: do not move the knife up. – Ryan Jan 29 '20 at 05:58