The Criteria for Homologation of Information Systems Security
(Les critères d'homologation de la sécurité des systèmes d'information)
This paper was written by Dr. David Brewer and was presented at the Eurosec 98 conference in Paris on 16 March 1998. Copyright © Gamma Secure Systems Limited, 1998.
In English, the French word "homologation" means "officially recognised". In the context of security and information technology, homologation describes a process whereby a security authority grants permission for an information technology (IT) system to be used for operational purposes with live data. The process is usually applied in the context of information confidentiality, implying that a system is not permitted to handle sensitive information until the homologation process is complete. Indeed, the process is traditionally perceived as a state transition: at some point in time a non-homologated system is instantaneously transformed into a homologated one. Homologation is often called "accreditation" in military or national security circles.
This paper investigates the evolution of homologation criteria and examines their utility in real world situations. Our conclusions force us to challenge the simple state transition model and replace it with the more radical idea that homologation is really a complex, continuous, life cycle process. We show that there is an alternative, that an appropriate marriage of emergent Common Criteria and BS7799 goes part way to solving the problem, and that real-time security management tools are a necessary ingredient for success.
Problems of Homologation
The US Orange Book was conceived in an era of computing technology when the mainframe computer was king. In a world of paper-tape and punched cards, computing was confined to the computer room and any sense of computer security was the dominion of the operating system. As far as security was concerned, the first obstacle that a spy would have to overcome would be to gain access to the site on which the computer installation was located. The spy would then have to gain access to the computer room, and then to the computer operating system itself. In this paradigm there are three distinct levels of defence: "site security", "room security" and "operating system security". The Orange Book identified a comprehensive set of security evaluation criteria. However, from a homologation perspective, they applied only to the operating system. Contemporary European work accepted the US work as a partial solution and initially concentrated on completing the picture by developing criteria to govern the site and room security. This gave rise to the concept of a "System Security Policy", or SSP that would determine the site, room and operating system safeguards necessary to protect the data held within the computer system from unauthorised access.
By the time the Orange Book was published (1985), remote mainframe access via dumb terminals was commonplace. Clearly, homologation now had to take cognisance of communication security (COMSEC) and electromagnetic radiation (TEMPEST), but the basic ideas of site, room and operating system security, and the SSP, still applied.
With hindsight we can recognise that we had just been given our first glimpse of a fundamental problem with homologation: the need for homologation criteria to keep pace with technology. We were soon to learn other factors, in particular cost and time. Indeed, as has been found with product evaluations, by the time the homologation process is complete, system requirements may have changed sufficiently to warrant restarting the homologation process all over again. There are also procurement issues, particularly concerning outsourcing and private finance initiatives.
From its beginnings in the mid-1980s to the present day, homologation has been characterised by:
The second characteristic is elegantly embraced by the ITSEC. For example, when applied in a system as opposed to a product context, the ITSEC explicitly calls up an SSP as a major component of the Security Target. The ITSEC provides criteria to test the IT within the context of the SSP but falls short of providing criteria to test the effectiveness of the physical, personnel and procedural safeguards specified by the SSP. Surprisingly, this has never been regarded as a major weakness. Technology had moved on, of course: by the time the ITSEC was published (1991) data exchange between systems had become commonplace. This gave rise to the idea of a "System Interconnection Security Policy" (SISP) to govern bilateral data exchange between systems and a "Community Security Policy" (CSP) when three or more systems enter into a common data exchange agreement.
A greater challenge has arisen from the growth of the domestic PC market. The ability of individuals to buy off-the-shelf IT for home use, often with greater computing power than they enjoy at work, has prompted businesses and government departments to streamline their acquisition processes. Coupled with modern risk management disciplines, this has led to an evolutionary approach to IT acquisition, with which ITSEC-based homologation processes can hardly keep pace. While the complete application of ITSEC in a system context may take 9 months or longer, a typical evolutionary approach may introduce major system upgrades, with a corresponding change in user security profiles, every three months or so.
An alternative approach might be to apply ITSEC in conjunction with a Certificate Maintenance Scheme (CMS). However, the UK CMS was developed as a way to reduce the need for product re-evaluations. It is not intended to address changes in security threats, objectives or safeguards.
A more radical alternative would be to make use only of certified Security Enforcing Functions (SEFs). In other words: firstly, the security components of an IT system are constructed solely from ITSEC evaluated products and, secondly, the SSP specifies only those security features that were certified during the product evaluations. All other required security functionality is then provided by physical, personnel or procedural means. This approach has been applied successfully in several Microsoft NT-based LANs, including one for an Anglo-French-Italian project. In this case, its effect was to reduce the ITSEC part of the homologation process to the submission of a security target and effectiveness documentation to the homologation authority, and to obviate the need for a system evaluation.
However, none of these approaches explicitly recognises that information security is not only a technical problem, it is also a management problem. We will therefore introduce the idea of an Information Security Management System (ISMS) to address the management issues. Later we will show that the ISMS is perhaps an essential component of the homologation process.
The popular uptake of the World Wide Web (WWW) has rendered computer access "user centric". Many people now enjoy a computing environment where they transparently connect to virtually any computer in the world to access information. Alternatively, people have borrowed the Internet concept to set up their own "private" intra- or extranet facilities.
From a homologation perspective there is no longer a many to one relationship between users and computer systems but a many to many relationship. Indeed, in this context, the concept of a computer system, as originally conceived by the Orange Book, is severely strained. It is perhaps wiser to think in terms of service providers and client communities. However, the impact of this approach on the homologation process is not straightforward as Brewer and Wilsher intimated at Eurosec 97. In the traditional sense, the homologation process is carried out by the data owner, which is why, in the "Insurefast" case, Insurefast's clients wanted to audit the Insurefast service before they used it. However, they are not allowed to do this in practice and a different homologation process is required. To understand why, consider the interconnections of a service provider S and three client communities A, B and C. There are three cases:
The Impact of Acquisition
Viewed simply, the homologation process is the answer to the cautious buyer's prayer. The buyer acquires a system to keep the buyer's transactions secret. From the buyer's perspective, there can be no absolute guarantee that the supplier is not, at least, the unwitting agent of the criminal or spy. The buyer therefore invokes the homologation procedure to determine whether the system is safe to use before using it.
Of course, the buyer ought to pay the supplier for the system before invoking the homologation process. However, since there is a risk that the supplier may be required to modify the system before it can pass homologation, the buyer will invariably defer payment until after successful homologation. Unfortunately, the homologation process is not under the supplier's control. Seen from the supplier's perspective, homologation could be a means to obtain use of a system without paying for it. The homologation process therefore has a tendency to cause friction between the buyer and the supplier. This friction is at its greatest when large amounts of private finance are at stake.
The Need for Real-Time Security
The homologation process need not be restricted to issues of confidentiality. Let us introduce the notion of information quality to capture the concepts of fitness-for-purpose and the timeliness of information. From a management perspective, the quality of information is often its fitness for the purpose of making an informed decision, and can be measured by the ability of the manager to spot mistakes and correct them before that decision has been made. Just as the homologation process can tell people that their systems will keep their secrets secret, so, when applied in the context of information quality, it will tell them that their systems can be relied upon to help them spot mistakes.
The ability to spot mistakes is, of course, a real-time activity; the collapse of Barings being a classic counter-example. Good management requires the continuous monitoring of business information. Likewise, in the context of open systems, world knowledge of vulnerabilities increases daily, as does technology's ability to deliver new safeguards. Gone are the days when security systems could be introduced, homologated and then forgotten about. In today's world we need to react rapidly to changes in threat, exposure and vulnerability. A "step-function" homologation process is therefore probably insufficient: by the time it completes, changes in threat, exposure and vulnerability may force it to start all over again. On this view, homologation needs to be a continuous process.
If homologation is a continuous process, then in addition to a SSP and ITSEC (or ITSEC-like criteria) we undoubtedly require mechanisms to audit the actual state of the subject of homologation (e.g. actual network topology) and the state-of-the-art regarding information security attack and defence. The chosen mechanisms would be IT-based and if they were not deployed continuously they would be deployed regularly on a daily, weekly or monthly basis. Such mechanisms might include:
A variety of commercial-off-the-shelf tools exist in support of all of these categories, the most useful addressing two or more categories. A potentially canonical example used to be the "Expert" risk assessment product. This product, which no longer seems to be available, combined network and modem discovery with vulnerability identification and safeguard prioritisation. The utility of this tool was further enhanced by its ability to connect to an on-line, regularly updated library of vulnerabilities and safeguards. This particular feature categorised it as a management tool, able to assist a service provider or system manager to check compliance with the SSP regularly and take action accordingly. In this sense, homologation becomes a day-to-day management problem. We therefore return to the idea of an Information Security Management System (ISMS), which has the objective of assisting management to carry out this process. Of course, such an ISMS is merely an information quality system with the sole purpose of facilitating informed decisions about the security of the enterprise (the "target enterprise") within the scope of the ISMS. In this sense, it would be sensible to demand homologation for the ISMS. The homologated ISMS then ensures continued homologation for the target enterprise.
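The day-to-day compliance checking described above can be pictured as a simple monitoring loop: discover the actual state of the network and compare it against the baseline recorded in the SSP. The sketch below is purely illustrative; the baseline format and the `discover_hosts` stand-in are hypothetical inventions, where a real ISMS would drive such checks from commercial discovery and vulnerability-scanning tools.

```python
# Illustrative sketch of continuous SSP compliance monitoring.
# The baseline format and discover_hosts() are hypothetical.

SSP_BASELINE = {"hosts": {"web01", "db01", "fw01"}}

def discover_hosts():
    """Stand-in for a network-discovery tool: returns hosts actually seen."""
    return {"web01", "db01", "fw01", "rogue-pc"}

def check_compliance(baseline, discovered):
    """Compare the actual network state against the SSP baseline."""
    findings = []
    unexpected = discovered - baseline["hosts"]
    if unexpected:
        findings.append(f"Unauthorised hosts on network: {sorted(unexpected)}")
    missing = baseline["hosts"] - discovered
    if missing:
        findings.append(f"Expected hosts not responding: {sorted(missing)}")
    return findings

findings = check_compliance(SSP_BASELINE, discover_hosts())
for finding in findings:
    print("SSP deviation:", finding)
```

Run daily, weekly or monthly (or continuously), such a loop turns homologation from a one-off state transition into routine management of deviations from the SSP.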
We must therefore shift our concept of homologation from the target enterprise, e.g., the traditional IT system as defined by ITSEC, to the ISMS that manages the security of that target enterprise. Fortunately, criteria now exist which facilitate the homologation of an ISMS.
BS7799: 1995 Part 1 was originally conceived as a code of practice for information security management. It catalogues a whole host of good security controls with near universal applicability for multi-national organisations. The 1995 attempt to adopt BS7799 as an ISO standard failed, which perhaps was a good thing as it has focused attention on the importance of the ISMS, as opposed to the technology involved. Two particular developments have taken place:
The Dutch advanced the idea of allowing organisations to decide not only which BS7799 security controls applied to their target enterprise and which did not, but also which additional security controls, not covered by BS7799, applied. The UK embraced this "supersetting" idea wholeheartedly in formulating BS7799:1997 Part 2 to address accredited certification against the standard. BS7799 Part 2 requires the creation of an ISMS, rather as ISO 9000 requires the creation of a quality management system. As in the ISO 9000 case, BS7799 certification certifies the ISMS.
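The "supersetting" rule amounts to straightforward set arithmetic over control identifiers: the ISMS scope is the catalogued controls judged applicable, plus any additional controls the organisation needs. The control names below are invented placeholders for illustration, not actual BS7799 clause references:

```python
# Hypothetical control identifiers; not actual BS7799 clause references.
bs7799_controls = {"access-control", "incident-reporting", "media-handling"}

# The organisation decides which catalogued controls do not apply...
excluded = {"media-handling"}       # judged not applicable to this enterprise

# ...and which additional, non-catalogued controls are needed.
additional = {"pki-key-ceremony"}   # e.g. a Trust Service Provider control

# The ISMS scope is the "superset": selected catalogue controls plus extras.
isms_controls = (bs7799_controls - excluded) | additional
print(sorted(isms_controls))
# ['access-control', 'incident-reporting', 'pki-key-ceremony']
```

The point of the rule is the last line: the resulting set, not the catalogue, is what the ISMS is certified against.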
Table 1 compares the mandatory requirements of BS7799:1997 Part 2 with those required for the traditional homologation process practised by the British government (IM5). The compatibility is striking for two reasons:
However, it is BS7799's ability to invoke other standards through the "superset" rule that renders BS7799 of paramount importance as far as homologation is concerned. With reference to Trust Services (for example the provision of basic cryptographic certification services), a combination of PKIX and BS7799:1995 provides a useful basis for designing a BS7799 certifiable ISMS for a Trust Service Provider. In the context of the "competing enterprise" homologation scenario previously described, this ought to provide the mechanism for asserting adherence to the desired publicly recognised standard.
The Common Criteria's Advantage
But, of course, homologation is not just a management problem: we need the right technical solution to be in place as well as a reliable management system to deploy that solution, monitor its effectiveness and react accordingly.
The Common Criteria (CC) provides criteria for the evaluation of IT security measures that harmonise the European (ITSEC) and North American (US Federal Criteria and the Canadian Criteria) approaches. Its advantage over the ITSEC is threefold:
Like ITSEC, the CC assumes a static world where security products and systems are built once and last forever. This is a false world: in reality, products and systems evolve rapidly. Thus, if ITSEC dies, so will the CC; the advantages of the CC over ITSEC will not sustain it. However, a "marriage" of the CC and BS7799 would allow the CC to assume the dynamic world where products and systems are forever changing. The CC's dowry would, in particular, be the extension of its library of functional dependencies into the non-technical areas required by an ISMS. The ability to mix and match assurance criteria and declare generic ISMSs would undoubtedly be advantageous. The "marriage" could, for example, consummate the concept of using only certified SEFs.
The AccredIS Solution
BS7799 and the CC ought to form the basis for the modern day homologation process. AccredIS provides a means of achieving it.
AccredIS is really three things:
The AccredIS process was described at Eurosec 97. At that time we merely hoped that BS7799 certification would produce the required public statement of assurance in support of the modern day homologation process. The inclusion of the "superset" rule in BS7799:1997 - Part 2, makes this hope a reality.
Summary and Conclusions
In this paper we have investigated the evolution and use of homologation criteria. We concluded that with the rapid advances in technology and migration towards a more user-centric culture, the homologation process should move towards certifying the ISMS rather than the target enterprise. An appropriate marriage of emergent Common Criteria and the BS7799 standard goes part way to solving the problem, but real-time security management tools are likely to be a necessary ingredient for success.