Information Security - an Oxymoron for the information age
“Always the beautiful answer who asks a more beautiful question.” e. e. cummings
...or why i am with Gelernter
The premise of this mashup is to examine the Saltzer and Schroeder paper, written in 1975 and still the basis for most information security programs, against Gelernter's manifesto on where computing is actually going. Each of the eight principles in Saltzer and Schroeder's paper is listed in order, followed by select excerpts from Gelernter's manifesto. The point of the comparison is to set theoretical information security principles vis-a-vis the actual utility of modern information systems. I will not attempt to reconcile theory and practice, but I will point out where the two schools of thought agree. In fairness, Saltzer and Schroeder's paper was written 25 years before Gelernter's; however, their principles dominate thinking about information security to this day, so it's important to view them side by side with Gelernter's thinking on the direction of computing.
Saltzer and Schroeder:
"a) Economy of mechanism: Keep the design as simple and small as possible. This well-known principle applies to any aspect of a system, but it deserves emphasis for protection mechanisms for this reason: design and implementation errors that result in unwanted access paths will not be noticed during normal use (since normal use usually does not include attempts to exercise improper access paths). As a result, techniques such as line-by-line inspection of software and physical examination of hardware that implements protection mechanisms are necessary. For such techniques to be successful, a small and simple design is essential."
Gelernter:
"9. The computing future is based on "cyberbodies" — self-contained, neatly-ordered, beautifully-laid-out collections of information, like immaculate giant gardens."
Conclusion(gp): So far, so good.
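To make the economy-of-mechanism point concrete: a protection mechanism small enough to read line by line is one you can actually audit. A minimal sketch in Python, with hypothetical principals and resources invented for illustration:

```python
# A deliberately tiny reference monitor: the entire mechanism and its policy
# fit in a few lines, so line-by-line inspection is actually feasible.
# The principals and resources below are hypothetical examples.
ACL = {
    ("alice", "payroll.db"): {"read"},
    ("bob",   "payroll.db"): {"read", "write"},
}

def is_allowed(principal: str, resource: str, action: str) -> bool:
    """Grant the action only if the (principal, resource) pair explicitly allows it."""
    return action in ACL.get((principal, resource), set())
```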
**
Saltzer and Schroeder:
"b) Fail-safe defaults: Base access decisions on permission rather than exclusion. This principle, suggested by E. Glaser in 1965,8 means that the default situation is lack of access, and the protection scheme identifies conditions under which access is permitted. The alternative, in which mechanisms attempt to identify conditions under which access should be refused, presents the wrong psychological base for secure system design. A conservative design must be based on arguments why objects should be accessible, rather than why they should not. In a large system some objects will be inadequately considered, so a default of lack of permission is safer. A design or implementation mistake in a mechanism that gives explicit permission tends to fail by refusing permission, a safe situation, since it will be quickly detected. On the other hand, a design or implementation mistake in a mechanism that explicitly excludes access tends to fail by allowing access, a failure which may go unnoticed in normal use. This principle applies both to the outward appearance of the protection mechanism and to its underlying implementation."
Conclusion(gp): A conservative design principle that puts the object's owner in control of permissions. This makes a lot of sense from the object's point of view, but it does little to address the use case in which the object actually executes.
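Read as code, "permission rather than exclusion" is the difference between an allowlist that fails closed and a blocklist that fails open. A hedged sketch, with names that are mine rather than the paper's:

```python
# Fail-safe default: anything not explicitly granted is denied.
GRANTS = {
    "alice": {"report.txt": {"read"}},
}

def check_access(user: str, obj: str, action: str) -> bool:
    # An unconsidered user or object simply falls through to denial,
    # the "safe" failure mode Saltzer and Schroeder describe.
    return action in GRANTS.get(user, {}).get(obj, set())

# The alternative, exclusion, fails open: forget to list an object and
# the mistake silently allows access.
DENIED = {("mallory", "report.txt")}

def check_access_by_exclusion(user: str, obj: str, action: str) -> bool:
    return (user, obj) not in DENIED
```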
**
Saltzer and Schroeder:
"c) Complete mediation: Every access to every object must be checked for authority. This principle, when systematically applied, is the primary underpinning of the protection system. It forces a system-wide view of access control, which in addition to normal operation includes initialization, recovery, shutdown, and maintenance. It implies that a foolproof method of identifying the source of every request must be devised. It also requires that proposals to gain performance by remembering the result of an authority check be examined skeptically. If a change in authority occurs, such remembered results must be systematically updated."
Gelernter:
"8. The software systems we depend on most today are operating systems (Unix, the Macintosh OS, Windows et. al.) and browsers (Internet Explorer, Netscape Communicator...). Operating systems are connectors that fasten users to computers; they attach to the computer at one end, the user at the other. Browsers fasten users to remote computers, to "servers" on the internet.
Today's operating systems and browsers are obsolete because people no longer want to be connected to computers — near ones OR remote ones. (They probably never did). They want to be connected to information. In the future, people are connected to cyberbodies; cyberbodies drift in the computational cosmos — also known as the Swarm, the Cybersphere.
13. Any well-designed next-generation electronic gadget will come with a "Disable Omniscience" button.
17. A cyberbody can be replicated or distributed over many computers; can inhabit many computers at the same time. If the Cybersphere's computers are tiles in a paved courtyard, a cyberbody is a cloud's drifting shadow covering many tiles simultaneously.
20. If a million people use a Web site simultaneously, doesn't that mean that we must have a heavy-duty remote server to keep them all happy? No; we could move the site onto a million desktops and use the internet for coordination. The "site" is like a military unit in the field, the general moving with his troops (or like a hockey team in constant swarming motion). (We used essentially this technique to build the first tuple space implementations. They seemed to depend on a shared server, but the server was an illusion; there was no server, just a swarm of clients.) Could Amazon.com be an itinerant horde instead of a fixed Central Command Post? Yes."
Conclusion(gp): Complete mediation provides the underpinning for Saltzer and Schroeder's system, but it does not appear to scale to the desired itinerant horde, at least as commonly interpreted.
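The caveat about cached authority checks is where complete mediation usually bends in practice. A minimal sketch, assuming a hypothetical in-process monitor, of what "remembered results must be systematically updated" might look like:

```python
# Every access goes through check(); revoking authority flushes any
# remembered decisions, as complete mediation requires.
class ReferenceMonitor:
    def __init__(self, grants):
        self._grants = grants          # {(user, obj): set of actions}
        self._cache = {}               # remembered authority decisions

    def check(self, user, obj, action):
        key = (user, obj, action)
        if key not in self._cache:
            self._cache[key] = action in self._grants.get((user, obj), set())
        return self._cache[key]

    def revoke(self, user, obj):
        self._grants.pop((user, obj), None)
        # Authority changed: systematically update the remembered results.
        self._cache = {k: v for k, v in self._cache.items() if (k[0], k[1]) != (user, obj)}
```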
**
Saltzer and Schroeder:
"d) Open design: The design should not be secret. The mechanisms should not depend on the ignorance of potential attackers, but rather on the possession of specific, more easily protected, keys or passwords. This decoupling of protection mechanisms from protection keys permits the mechanisms to be examined by many reviewers without concern that the review may itself compromise the safeguards. In addition, any skeptical user may be allowed to convince himself that the system he is about to use is adequate for his purpose. Finally, it is simply not realistic to attempt to maintain secrecy for any system which receives wide distribution."
Conclusion(gp): Both seem to agree; it is hard to get the itinerant horde moving in a swarm without open standards.
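Open design in code terms: publish the mechanism, protect only the key. A small sketch using Python's standard-library HMAC; the key value here is a placeholder, not advice on key management:

```python
import hmac
import hashlib

# The verification mechanism (HMAC-SHA256) is public and reviewable;
# only the key is secret.
SECRET_KEY = b"replace-with-a-real-secret"

def sign(message: bytes) -> str:
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    # Constant-time comparison; publishing the design does not weaken it.
    return hmac.compare_digest(sign(message), tag)
```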
**
Saltzer and Schroeder:
"e) Separation of privilege: Where feasible, a protection mechanism that requires two keys to unlock it is more robust and flexible than one that allows access to the presenter of only a single key. The relevance of this observation to computer systems was pointed out by R. Needham in 1973. The reason is that, once the mechanism is locked, the two keys can be physically separated and distinct programs, organizations, or individuals made responsible for them. From then on, no single accident, deception, or breach of trust is sufficient to compromise the protected information. This principle is often used in bank safe-deposit boxes. It is also at work in the defense system that fires a nuclear weapon only if two different people both give the correct command. In a computer system, separated keys apply to any situation in which two or more conditions must be met before access should be permitted. For example, systems providing user-extendible protected data types usually depend on separation of privilege for their implementation."
Gelernter:
"37. Elements stored in a mind do not have names and are not organized into folders; are retrieved not by name or folder but by contents. (Hear a voice, think of a face: you've retrieved a memory that contains the voice as one component.) You can see everything in your memory from the standpoint of past, present and future. Using a file cabinet, you classify information when you put it in; minds classify information when it is taken out. (Yesterday afternoon at four you stood with Natasha on Fifth Avenue in the rain — as you might recall when you are thinking about "Fifth Avenue," "rain," "Natasha" or many other things. But you attached no such labels to the memory when you acquired it. The classification happened retrospectively.)"
Conclusion(gp): Information security models tend to look at things statically, through information classification lenses, but it's how information is used that makes it valuable. In practice, this is where information security theory breaks down in the face of reality: what does an access control matrix look like for a mashup? What does it look like for a data mining app?
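Saltzer and Schroeder's two-key safe-deposit box translates directly into a two-approval rule. A hedged sketch, with roles I have invented:

```python
# Separation of privilege: the action proceeds only when two independent
# parties, holding separate "keys", both sign off.
def release_funds(approvals: dict) -> bool:
    """approvals maps role to approver, e.g. {"teller": "alice", "manager": "bob"}."""
    required_roles = {"teller", "manager"}
    distinct_people = set(approvals.values())
    # Both roles must approve, and by two different people, so no single
    # accident, deception, or breach of trust is enough on its own.
    return required_roles.issubset(approvals) and len(distinct_people) >= 2
```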
**
Saltzer and Schroeder:
"f) Least privilege: Every program and every user of the system should operate using the least set of privileges necessary to complete the job. Primarily, this principle limits the damage that can result from an accident or error. It also reduces the number of potential interactions among privileged programs to the minimum for correct operation, so that unintentional, unwanted, or improper uses of privilege are less likely to occur. Thus, if a question arises related to misuse of a privilege, the number of programs that must be audited is minimized. Put another way, if a mechanism can provide "firewalls," the principle of least privilege provides a rationale for where to install the firewalls. The military security rule of "need-to-know" is an example of this principle."
Gelernter:
"28. Metaphors have a profound effect on computing: the file-cabinet metaphor traps us in a "passive" instead of "active" view of information management that is fundamentally wrong for computers.
29. The rigid file and directory system you are stuck with on your Mac or PC was designed by programmers for programmers — and is still a good system for programmers. It is no good for non-programmers. It never was, and was never intended to be.
30. If you have three pet dogs, give them names. If you have 10,000 head of cattle, don't bother. Nowadays the idea of giving a name to every file on your computer is ridiculous."
Conclusion(gp): Least Privilege is the point where the practical matter of applying Saltzer and Schroeder's principles breaks down in modern systems. It's a deployment issue, and a matter of insufficient models and modes.
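Least privilege in miniature: each job declares the scopes it needs, and anything beyond that is refused and therefore auditable. The job and scope names below are hypothetical:

```python
# Least privilege: grant each job only the scopes it needs to do its work.
JOB_SCOPES = {
    "report-generator": {"read:sales"},
    "billing-export":   {"read:sales", "write:invoices"},
}

def authorize(job: str, requested: set) -> bool:
    granted = JOB_SCOPES.get(job, set())   # unknown jobs get nothing
    return requested <= granted            # any extra privilege is refused

# authorize("report-generator", {"read:sales"})      -> True
# authorize("report-generator", {"write:invoices"})  -> False
```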
**
Saltzer and Schroeder:
"g) Least common mechanism: Minimize the amount of mechanism common to more than one user and depended on by all users [28]. Every shared mechanism (especially one involving shared variables) represents a potential information path between users and must be designed with great care to be sure it does not unintentionally compromise security. Further, any mechanism serving all users must be certified to the satisfaction of every user, a job presumably harder than satisfying only one or a few users. For example, given the choice of implementing a new function as a supervisor procedure shared by all users or as a library procedure that can be handled as though it were the user's own, choose the latter course. Then, if one or a few users are not satisfied with the level of certification of the function, they can provide a substitute or not use it at all. Either way, they can avoid being harmed by a mistake in it."
Gelernter:
"6. Miniaturization was the big theme in the first age of computers: rising power, falling prices, computers for everybody. Theme of the Second Age now approaching: computing transcends computers. Information travels through a sea of anonymous, interchangeable computers like a breeze through tall grass. A dekstop computer is a scooped-out hole in the beach where information from the Cybersphere wells up like seawater.
16. The future is dense with computers. They will hang around everywhere in lush growths like Spanish moss. They will swarm like locusts. But a swarm is not merely a big crowd. The individuals in the swarm lose their identities. The computers that make up this global swarm will blend together into the seamless substance of the Cybersphere. Within the swarm, individual computers will be as anonymous as molecules of air.
55. Software can solve hard problems in two ways: by algorithm or by making connections — by delivering the problem to exactly the right human problem-solver. The second technique is just as powerful as the first, but so far we have ignored it.
56. Lifestreams and microcosms are the two most important cyberbody types; they relate to each other as a single musical line relates to a single chord. The stream is a "moment in space," the microcosm a moment in time."
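Saltzer and Schroeder's own example for least common mechanism is choosing a per-user library procedure over a shared supervisor procedure. A rough sketch of that same choice, with invented names:

```python
# Least common mechanism: prefer state each user owns over state all users share.

# Shared mechanism: one global cache every user depends on. A flaw or
# poisoned entry here is a potential information path between users.
SHARED_CACHE = {}

# Per-user mechanism: each user holds an instance they can replace or skip,
# so a mistake in it harms only the user who chose to rely on it.
class PerUserCache:
    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def put(self, key, value):
        self._data[key] = value
```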
**
Saltzer and Schroeder:
"h) Psychological acceptability: It is essential that the human interface be designed for ease of use, so that users routinely and automatically apply the protection mechanisms correctly. Also, to the extent that the user's mental image of his protection goals matches the mechanisms he must use, mistakes will be minimized. If he must translate his image of his protection needs into a radically different specification language, he will make errors."
Gelernter:
"7. "The network is the computer" — yes; but we're less interested in computers all the time. The real topic in astronomy is the cosmos, not telescopes. The real topic in computing is the Cybersphere and the cyberstructures in it, not the computers we use as telescopes and tuners.
27. Modern computing is based on an analogy between computers and file cabinets that is fundamentally wrong and affects nearly every move we make. (We store "files" on disks, write "records," organize files into "folders" — file-cabinet language.) Computers are fundamentally unlike file cabinets because they can take action.
31. Our standard policy on file names has far-reaching consequences: doesn't merely force us to make up names where no name is called for; also imposes strong limits on our handling of an important class of documents — ones that arrive from the outside world. A newly-arrived email message (for example) can't stand on its own as a separate document — can't show up alongside other files in searches, sit by itself on the desktop, be opened or printed independently; it has no name, so it must be buried on arrival inside some existing file (the mail file) that does have a name. The same holds for incoming photos and faxes, Web bookmarks, scanned images...
32. You shouldn't have to put files in directories. The directories should reach out and take them. If a file belongs in six directories, all six should reach out and grab it automatically, simultaneously.
33. A file should be allowed to have no name, one name or many names. Many files should be allowed to share one name. A file should be allowed to be in no directory, one directory, or many directories. Many files should be allowed to share one directory. Of these eight possibilities, only three are legal and the other five are banned — for no good reason.
53. Your car, your school, your company and yourself are all one-track vehicles moving forward through time, and they will each leave a stream-shaped cyberbody (like an aircraft's contrail) behind them as they go. These vapor-trails of crystallized experience will represent our first concrete answer to a hard question: what is a company, a university, any sort of ongoing organization or institution, if its staff and customers and owners can all change, its buildings be bulldozed, its site relocated — what's left? What is it? The answer: a lifestream in cyberspace."
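Gelernter's points 31 through 33 amount to a tag or content index rather than a tree of named files, which is exactly where a classic access control matrix starts to get awkward. A minimal sketch of such a store, with an API I have invented for illustration:

```python
from collections import defaultdict
from itertools import count

# Items need no name, and any number of "directories" (tags) can claim the
# same item at once, in the spirit of Gelernter's points 31-33.
class Stream:
    def __init__(self):
        self._items = {}                  # internal id -> content
        self._tags = defaultdict(set)     # tag -> ids
        self._ids = count()

    def add(self, content: str, tags) -> int:
        item_id = next(self._ids)         # no user-visible name required
        self._items[item_id] = content
        for tag in tags:                  # every matching "directory" reaches out and grabs it
            self._tags[tag].add(item_id)
        return item_id

    def find(self, tag: str):
        return [self._items[i] for i in sorted(self._tags[tag])]

# s = Stream(); s.add("Fifth Avenue, rain, Natasha", {"rain", "Natasha", "Fifth Avenue"})
# s.find("rain") -> ["Fifth Avenue, rain, Natasha"]
```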
**
Conclusion(gp):
The Saltzer and Schroeder principles of Open Design and Economy of Mechanism hold up well in the face of modern computing realities, and to a certain extent Fail-Safe Defaults does as well; however, if we information security people are to be effective, we need to rethink the other principles.
**
Last word: Gelernter:
We'll know the system is working when a butterfly wanders into the in-box and (a few wingbeats later) flutters out — and in that brief interval the system has transcribed the creature's appearance and analyzed its way of moving, and the real butterfly leaves a shadow-butterfly behind. Some time soon afterward you'll be examining some tedious electronic document and a cyber-butterfly will appear at the bottom left corner of your screen (maybe a Hamearis lucina) and pause there, briefly hiding the text (and showing its neatly-folded rusty-chocolate wings like Victorian paisley, with orange eyespots) — and moments later will have crossed the screen and be gone.