Designing Trusted (Secure) Operating Systems
General operating systems provide:
- memory protection
- file protection
- object access control
- user authentication
We say that an operating system is trusted if we have confidence that it provides these four services consistently and effectively.
5.1 What is a Trusted System?
We say that software is trusted software if we know that the code has been rigorously developed and analyzed, giving us reason to trust that the code does what it is expected to do and nothing more.
Trust is based on four characteristics:
- Functional correctness
- Enforcement of integrity
- Limited privilege - privilege is limited to this program and is not leaked or passed on to other programs.
- Appropriate confidence level - the examination of the program is commensurate with the degree of trust required to use the program.
5.2 Security Policies
A security policy is a statement of the security we expect the system to enforce.
See military example p 246.
What is the Chinese Wall Security Policy? (p 251).
Note: All of the policies given as examples in this section provide a statement delineating the expectations of the system. These rules often are, and should be, declarative in nature: they specify what the results should be, not how to achieve those results.
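As a concrete illustration of a declarative rule, here is a minimal Python sketch of a Chinese Wall style check: a subject may access a company's data only if it has not already accessed a competitor in the same conflict-of-interest class. The company names and conflict classes are hypothetical; the point is that the rule states what is permitted, not how the system achieves it.

```python
# A minimal sketch (illustrative, not from the text) of a Chinese Wall style check.
# Company names and conflict-of-interest classes are hypothetical.
CONFLICT_CLASS = {"BankA": "banking", "BankB": "banking", "OilCo": "energy"}

def may_access(history, company):
    """Allow access only if no previously accessed company is a competitor,
    i.e. a different company in the same conflict-of-interest class."""
    return all(CONFLICT_CLASS[c] != CONFLICT_CLASS[company] or c == company
               for c in history)

print(may_access({"BankA"}, "OilCo"))   # True: different conflict class
print(may_access({"BankA"}, "BankB"))   # False: competitor in the "banking" class
print(may_access({"BankA"}, "BankA"))   # True: same dataset already accessed
```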
5.3 Security Models
Everyone uses models in some way to describe, study, or analyze entities, relationships, or situations. In security we model for several specific purposes:
- Test a particular security policy for completeness and consistency
- Document the policy
- Conceptualize and design an implementation
- Check that an implementation meets its requirements.
Lattice Model of Access Security
The military security model is representative of a more general scheme, called a lattice: a partially ordered set in which every pair of elements has a least upper bound and a greatest lower bound (you should understand the lattice structure).
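As a minimal sketch (my illustration, not from the text), military-style security classes can be modeled as (rank, compartment set) pairs ordered by dominance; the least upper bound takes the higher rank and the union of the compartments. The rank and compartment names below are assumptions for the example.

```python
# Hypothetical ranks; a security class is a (rank, compartments) pair.
RANKS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def dominates(c1, c2):
    """True if class c1 dominates c2: higher or equal rank and a superset of compartments."""
    rank1, comps1 = c1
    rank2, comps2 = c2
    return RANKS[rank1] >= RANKS[rank2] and comps2 <= comps1

def least_upper_bound(c1, c2):
    """Lowest class that dominates both: higher rank, union of compartments."""
    rank = max(c1[0], c2[0], key=RANKS.get)
    return (rank, c1[1] | c2[1])

print(dominates(("secret", {"crypto"}), ("confidential", set())))        # True
print(dominates(("secret", {"crypto"}), ("confidential", {"nuclear"})))  # False: incomparable
print(least_upper_bound(("secret", {"crypto"}), ("confidential", {"nuclear"})))  # ('secret', {both compartments})
```

Not every pair of classes is comparable, but every pair has a least upper bound and a greatest lower bound, which is what makes the structure a lattice rather than a simple hierarchy.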
Bell-La Padula Confidentiality Model
The Bell and La Padula model is a formal description of the allowable paths of information flow in a secure system. The model's goal is to identify allowable communication when maintaining secrecy is important.
Consider the following example:
Let $$S$$ be a set of subjects and $$O$$ be a set of objects. Each $$s \in S$$ and each $$o \in O$$ has a fixed security class, $$C(s)$$ or $$C(o)$$, denoting its clearance or classification level. The security classes are ordered by a relation $$\le$$. (Note: the classes may form a lattice, even though the Bell-La Padula model can apply to posets.)
Two properties characterize the secure flow of information.
Simple Security Property. A subject $$s$$ may have read access to an object $$o$$ only if $$C(o) \le C(s)$$.
*-Property (called the "star property"). A subject $$s$$ who has read access to an object $$o$$ may have write access to an object $$p$$ only if $$C(o) \le C(p)$$.
Example: Applying the first property to the military classification model, we could state that the subject (or person) must possess a clearance as high as or higher than the classification of any object they read. Applying the second property to the military classification model, a subject who has read an object may write only to objects classified at least as high, so classified information cannot be written down to a lower level.
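The two properties can be written directly as checks. The sketch below is illustrative only: the integer levels and the subject and object names are hypothetical stand-ins for $$C(\cdot)$$.

```python
# Hypothetical clearances / classifications standing in for C(.)
C = {"alice": 2, "bob": 1, "design.doc": 2, "memo.txt": 1}

def can_read(subject, obj):
    # Simple security property: read o only if C(o) <= C(s)  ("no read up")
    return C[obj] <= C[subject]

def can_write(read_obj, write_obj):
    # *-property: having read o, write to p only if C(o) <= C(p)  ("no write down")
    return C[read_obj] <= C[write_obj]

print(can_read("alice", "memo.txt"))        # True: 1 <= 2
print(can_read("bob", "design.doc"))        # False: 2 <= 1 fails
print(can_write("design.doc", "memo.txt"))  # False: would move level-2 data to level 1
```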
5.4 Trusted Operating System Design
Security Features of an ordinary OS
- User authentication
- Memory protection
- File and I/O device access control
- Allocation and access control to general objects
- Enforced sharing (integrity and consistency etc.)
- IPC and synchronization
- Protection of OS data
Additional features of a Trusted OS
- User identification and authentication
- Mandatory access control
- Discretionary access control
- Object reuse protection
- Trusted path
- Audit
- Audit log reduction (or reuse - Snort actually reduces its log by showing only one occurrence of a common error; see the sketch after this list.)
- Intrusion detection
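As a small illustration of audit log reduction, the sketch below collapses repeated events into one line with a count, in the spirit of the Snort behavior noted above. The log entries are made up.

```python
# A minimal sketch of audit log reduction: show each distinct event once with a count.
from collections import Counter

raw_log = [
    "failed login for root from 10.0.0.5",
    "failed login for root from 10.0.0.5",
    "failed login for root from 10.0.0.5",
    "permission denied: /etc/shadow",
]

def reduce_log(entries):
    """Collapse duplicate entries, keeping a count of occurrences."""
    return [f"{event}  (x{count})" for event, count in Counter(entries).items()]

for line in reduce_log(raw_log):
    print(line)
# failed login for root from 10.0.0.5  (x3)
# permission denied: /etc/shadow  (x1)
```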
Kernelized Design
The kernel is the part of the operating system that performs the lowest-level functions. A security kernel is responsible for enforcing the security mechanisms of the entire OS.
- Reference monitor (Figure 5-12, p. 275) is the portion of the kernel that controls accesses to objects (see the sketch after this list). It must be:
- tamper proof
- cannot bypass
- analyzable
- Trusted Computing Base (TCB) is the name we give to everything in the trusted system necessary to enforce the security policy. See Figure 5-13, pp. 277-278, and Figure 5-14, p. 278.
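The sketch below illustrates the reference monitor idea from the list above: every access request goes through one small, analyzable check against a policy table, defaulting to deny, with every decision recorded for audit. The subjects, objects, and rights in the table are hypothetical.

```python
# A minimal sketch of a reference monitor: one choke point that mediates every access.
POLICY = {("alice", "payroll.db"): {"read"},
          ("bob", "payroll.db"): {"read", "write"}}   # hypothetical access rights

audit_log = []   # every decision is recorded for later audit

def reference_monitor(subject, obj, mode):
    """Grant the access only if the policy explicitly allows it (default deny)."""
    allowed = mode in POLICY.get((subject, obj), set())
    audit_log.append((subject, obj, mode, "granted" if allowed else "denied"))
    return allowed

if reference_monitor("alice", "payroll.db", "write"):
    pass  # perform the access only when the monitor grants it

print(audit_log)   # [('alice', 'payroll.db', 'write', 'denied')]
```

Keeping this mediation small is what makes it practical to render tamper proof, impossible to bypass, and analyzable.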
We are not covering Virtualization.
One may also consider the layered approach. Layering functions to build a complex system occurs frequently, and some even consider it absolutely necessary. See Figure 5-20, p. 284, and Table 5-4, p. 286.
5.5 Assurance in Trusted Operating Systems
Assurance is the art of convincing someone (for the first time) that your program or OS should be trusted. Mostly what we have today is reassurance. We start with Assurance Methods.
- Testing - one of the most common methods, but it has problems...
- Testing can demonstrate the existence of a problem, but passing tests does not demonstrate the absence of problems.
- Combinatorial explosion of cases to test.
- Test code may need to exist within the product, and removing it changes the product's behavior.
- Real-time testing is problematic because you must track states and triggers in real time.
- Penetration testing / tiger team analysis / ethical hacking - based on having your own hackers try to attack the system.
- Hackers may require long periods of time to find existing flaws.
- Does not solve the combinatorial explosion problem.
- Formal Verification - uses rules of mathematical logic to demonstrate that a system has certain security properties.
- In verification, the program/OS is modeled as a set of assertions.
- The collection of assertions is viewed as a theorem which is then proven.
- Problems:
- Mapping a Program/OS to a model of assertions may introduce approximations that simplify or ignore conditions within the program.
- Mapping is not always possible
- In general the problem is undecidable, which is why we are modeling in the first place. Going from undecidable to decidable necessarily involves some loss of expressiveness.
Walk through the example in SIC pp. 292.3-294.0; this is an excellent introduction that everyone should be able to understand.
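In the same spirit as that walkthrough (this is not the book's example), the sketch below annotates a small routine with a precondition, a loop invariant, and a postcondition. Formal verification argues, rather than tests, that these assertions hold for every possible input.

```python
def minimum(values):
    assert len(values) > 0                  # precondition: non-empty input
    smallest = values[0]
    for i, v in enumerate(values):
        if v < smallest:
            smallest = v
        # loop invariant: smallest is the minimum of values[0..i]
        assert smallest == min(values[: i + 1])
    assert smallest == min(values)          # postcondition: result is the true minimum
    return smallest

print(minimum([7, 3, 9, 3]))                # 3
```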
- Evaluation by an outside source is useful and sometimes essential.
- Use a standard
- Orange Book: Trusted Computer System Evaluation Criteria (TCSEC)
- European Information Technology Security Evaluation Criteria (ITSEC)