From the Editors: New Models for Old

[Originally published in the July/August 2009 issue (Volume 7, Number 4) of IEEE Security & Privacy magazine.]

When faced with a new thing, human beings do something very sensible. They try to harness previous experience and intuition in service of the new thing. How is this new thing like something that I already know and understand?

Trying to model the new thing on some old thing can be efficient, making it easier to reason about the new thing by using analogies adopted from previous experience. The late Claude Shannon did this at least twice in his illustrious career.

The 1930s were an intense period for digital circuits, with engineers busily designing and building ever more complex machines out of electromechanical relays. Design principles for relay systems were vague and imprecise, with engineers relying on rules of thumb and heuristics whose efficacy was limited. The result was a world in which tremendous potential was hampered by a real lack of powerful tools for reasoning about the artifacts that engineers were creating.

In 1937, Shannon wrote his master's thesis at MIT, titled "A Symbolic Analysis of Relay and Switching Circuits." In this work, which has been called "possibly the most important, and also the most famous, master's thesis of the [twentieth] century," he observed that if one restricted the interconnection topology only slightly, one could prove that relay circuits obeyed the mathematical rules George Boole had formalized in "An Investigation of the Laws of Thought" in 1854. Suddenly, engineers had in their hands powerful tools to help them analyze designs, predict their performance, and determine whether the designs could be made smaller or simpler. It's because of this work that today we refer to digital circuitry as "logic."
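
To make the observation concrete, here is a minimal sketch (mine, not Shannon's, with Python standing in for the algebra) of the mapping his thesis exploits: switches in series behave like Boolean AND, switches in parallel like Boolean OR, so two differently wired circuits can be proved equivalent by algebra rather than by trial and error.

```python
# Illustrative sketch only: modeling switch networks with Boolean algebra.
# Two switches in series conduct only if both are closed (AND);
# two switches in parallel conduct if either is closed (OR).

def series(a: bool, b: bool) -> bool:
    """A series connection conducts iff both switches are closed."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """A parallel connection conducts iff at least one switch is closed."""
    return a or b

def circuit(x: bool, y: bool, z: bool) -> bool:
    """A small network: (x in series with y), in parallel with z."""
    return parallel(series(x, y), z)

# Boolean identities let us verify that this wiring matches the expression
# (x AND y) OR z for every combination of switch positions, the kind of
# systematic check Shannon's result made routine.
for x in (False, True):
    for y in (False, True):
        for z in (False, True):
            assert circuit(x, y, z) == ((x and y) or z)
```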

If he had done no more in his career, Shannon would have been a major contributor, but he couldn't leave well enough alone. In 1948, he published "A Mathematical Theory of Communication," the paper that established the field of information theory. Its basic concept was that information could be modeled effectively using the mathematics of probability theory, particularly by adopting notation common to thermodynamics. The importance of the information theory work was so great that his earlier work on digital circuit theory has faded into comparative obscurity.
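
As a concrete illustration (my sketch, not anything drawn from the 1948 paper): the central quantity of the theory, the entropy of a source, takes the same functional form as the entropy of statistical thermodynamics, H = -Σ p·log2(p), measured in bits.

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one bit per toss.
print(entropy_bits([0.5, 0.5]))   # 1.0
# A heavily biased coin carries less information per toss.
print(entropy_bits([0.9, 0.1]))   # ~0.47
```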

The ability to reuse a model when it fits, even if only approximately, is a powerful tool for speeding the adoption of new technologies. The desktop metaphor is credited with helping the Macintosh rapidly reach a user community that had previously found computing inaccessible, and it went on to become the common metaphor across essentially all computing environments. Although the metaphor has its roots in the work of Douglas Engelbart and was refined at Xerox PARC, it's forever associated with the Macintosh.

Analogical and metaphorical reasoning doesn't always work, however. For each of the brilliant examples cited here, there's at least one counterexample in which such approaches fail.

Some years ago, I led a project at an investment bank to replace its use of microfiche with an online system. In designing the system, we referred to the SEC regulations governing the storage and retention of records by institutions such as ours. The regulations specified that only optical disks were permitted in these record retention systems. The provision that gave the engineers working on the design effort the most entertainment was the requirement that we provide a facility for "projecting" images of the stored documents. It was clear from the rule's wording that its authors had a mental model in which an optical disk was very much like microfiche, containing highly miniaturized photographic images of the documents stored there. In a microfiche system, a document is optically enlarged using what amounts to a slide projector. The intent of the regulation was obviously not that we provide a facility to project retrieved documents onto a screen but rather that our system be able to display an essentially unaltered rendition of the original document, allowing investigators to see such documents as the bank's staff saw them when they were first used.

As the system's designers, we felt compelled to write an extensive interpretive document that extracted the original intent from the regulations and to get the lawyers to sign off on that interpretation. Then we could ensure that each of those more appropriately posed requirements was met and document how that had been done. In this case, we'd inverted the overly specific regulation to get at the true underlying functional requirements. Of course, if the requirements had been written properly to start with, we could have avoided the time-consuming and expensive process of writing the interpretive document and getting it reviewed and approved by the compliance department. Moreover, we would have avoided the risk that the SEC might disagree with our interpretation and restatement of the requirements.

Why is this important? As technical professionals, we often bemoan the challenge of communicating technology's potential to laypeople, and we lament their often painful errors in attempting to pierce the complexities and grasp the essential concepts and values on offer. This challenge manifests itself in rules and regulations written to "fight the last war" and interpreted by auditors, reporters, and analysts who sometimes miss the essential point. Our frustration is that it's often these laypeople, rather than our technical leaders and visionaries, who establish public understanding of our contributions.

As an industry, we're now faced with a wide range of circumstances in which the security and privacy protections of systems are specified in laws and regulations. For instance, regulations such as the SEC rules, HIPAA, and SOX enshrine paper-based models of information storage and retrieval in their security and control requirements. If you have a paper record, how do you ensure its immunity from destruction, theft, or alteration? Why, you put it in a room with thick walls and strong locked doors. You check the backgrounds of everyone requesting access to the room, including the executives and the janitors. You implement careful processes to ensure that every transaction involving one of the documents is recorded in a log book somewhere.

Unfortunately, when you replace the file cabinets in the room with racks full of disks connected by networks, you discover that the thick walls are now as effective as a similar volume of air at securing the documents. But a literal audit might well give a clean bill of health to the roomful of disks. It's secured within a strong wall. The doors are locked. Everyone with access to the keys is known. A+.

What can we, the security and privacy technical community, do to improve things? Rules and regulations are unfortunately static documents that, in a dynamic technology world, will somehow always manage to find themselves out of date. We're in the midst of a huge society-wide change to move record keeping from paper systems to digital ones. In consequence, a vast number of existing rules can and should be rethought and revised. No better time than now, and no one better to do it than we.
