When I say "assurance" I mean the process of acquiring confidence that your box isn't going to do something really ugly if you fire it up or put it on your net. Operational, or black-box, assurance is based on the observation that a certain class of boxes hasn't killed anybody yet, so you're probably safe. Reliance on this form of assurance has led to some pretty nasty surprises. Formal, or glass-box, assurance attempts to provide confidence from some combination of design characteristics, analysis and testing. This approach is still prone to nasty surprises, but they tend to be fewer--or at least easier to explain after the fact. Most high-assurance work has been done in the area of kinetic devices and infernal machines that are controlled by stupid robots. As information processing technology becomes more important to society, these concerns spread to areas previously thought inherently harmless, like operating systems. Security is the most obvious example, along with availability of service in chaotic or hostile environments.
The NSA release incorporates an idea called Type Enforcement (TE) that Dick Kain and I cooked up over 15 years ago, as part of a project to investigate high-assurance systems. It's intended as a design characteristic that supports analysis and testing, in aid of assurance. Our retrospective paper [1] covers those aspects, so I'll concentrate here on the short- and long-term development work that I think needs to be done.
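To make the idea concrete: under TE, every process runs in a domain, every object carries a type, and the only accesses permitted are the ones the policy explicitly grants. The sketch below shows what such a rule looks like in the SELinux policy language; the type names are hypothetical, invented for illustration, and not taken from any shipped policy.

    # Hypothetical types, for illustration only.
    type editor_t;      # domain for an editor process
    type notes_t;       # type for the user's note files

    # Grant the editor domain read/write access to note files.
    # Anything not explicitly allowed by a rule like this is denied.
    allow editor_t notes_t:file { read write };

The payoff for assurance is that the rule set is small, static and explicit: an analyst can enumerate every access the policy permits, and test against that list, before the system ever runs.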