Showing posts from October, 2013

From the Editors: The Invisible Computers

[Originally published in the November/December 2011 issue (Volume 9 number 6) of IEEE Security & Privacy magazine.] Just over a decade ago, shortly before we launched IEEE Security & Privacy, MIT Press published Donald Norman's book The Invisible Computer. At the time, conversations about the book focused on the opportunities exposed by his powerful analogies between computers and small electric motors as system components. Today, almost everything we use has one or more computers, and a surprising number have so many that they require internal networks. For instance, a new automobile has so many computers in it that it has at least two local area networks, separated by a firewall, to connect them, along with interconnects to external systems. There's probably even a computer in the key! Medical device makers have also embraced computers as components. Implantable defibrillators and pacemakers have computers and control APIs. If it's a computer, it must have so

From the Editors: Privacy and the System Life Cycle

[Originally published in the March/April 2011 issue (Volume 9 number 2) of IEEE Security & Privacy magazine.] Engineering long-lived systems is hard, and adding privacy considerations to such systems makes the work harder. Who may look at private data that I put online? Certainly I may look at it, plus any person I explicitly authorize. When may the online system's operators look at it? Certainly when customer service representatives are assisting me in resolving a problem, they might look at the data, though I would expect them to get my permission before doing so. I would also expect my permission to extend only for the duration of the support transaction and to cover just enough data elements to allow the problem's analysis and resolution. When may developers responsible for the software's evolution and maintenance look at my data? Well, pretty much never. The exception is when they're called in during escalation of a customer service transaction. Yes, that'
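The access policy the editorial walks through (owner and explicit delegates always may look; support staff only during a consented support transaction; developers only when escalated into one) can be sketched as a small decision function. This is an illustrative sketch of that reasoning, not anything from the editorial itself; all names and fields here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    requester: str                        # who is asking to see the data (hypothetical field)
    role: str                             # "owner", "authorized", "support", or "developer"
    active_support_ticket: bool = False   # is a support transaction in progress?
    owner_consented: bool = False         # did the data owner grant permission for this transaction?

def may_view_private_data(req: AccessRequest) -> bool:
    """Encode the editorial's policy: the owner and anyone explicitly
    authorized may always look; support reps and developers may look
    only during a support transaction the owner has consented to."""
    if req.role in ("owner", "authorized"):
        return True
    if req.role in ("support", "developer"):
        return req.active_support_ticket and req.owner_consented
    return False

print(may_view_private_data(AccessRequest("alice", "owner")))                 # True
print(may_view_private_data(AccessRequest("bob", "support")))                 # False
print(may_view_private_data(AccessRequest("bob", "support",
                                          active_support_ticket=True,
                                          owner_consented=True)))             # True
```

Note that consent here is scoped to a single transaction, matching the editorial's point that permission should cover only the duration and data needed to resolve the problem.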

From the Editors: Phagocytes in Cyberspace

[Originally published in the March/April 2010 issue (Volume 8 number 2) of IEEE Security & Privacy magazine.] Let us reflect on the evolution of malware as our industry has progressed during the 30-plus years since computers moved out of the mainframe datacenter cathedrals and into the personal computer bazaars. We might be moving back to cathedrals these days with the expansion of cloud computing, but the personal computer is here to stay in one form or another -- whether it's desktop or laptop or PDA or smartphone, and whether it's a stand-alone system with fat client software or a network device with thinner clients. In the early days of computing, malware was transmitted by infected floppy disks. Authors were amateurs, virulence was low, and the risk was relatively minor—mostly an inconvenience. Later, the computing universe got larger and more densely connected as PCs became cheaper and the Internet and the Web made distributing software cheaper and easier. The sof

From the Editors: International Blues

[Originally published in the March/April 2010 issue (Volume 8 number 2) of IEEE Security & Privacy magazine.] IEEE Security & Privacy could be a lot more international in its focus and content. Reflecting on its content and tone over the past seven years, it's hard to tell that we think of either privacy or security in a broad international context. There are examples of taking a broader view, but they're more notable as exceptions than as standards. This is bad for several reasons. First, privacy and security have different levels of importance in different places in the world. Second, by largely ignoring the non-Western world, we risk dangerous blind spots. Third, we might be failing to take simple steps that would make our magazine more valuable worldwide. Although the purely technical aspects of our work are universal and generic, engineering is all about making trade-offs informed by economic and cultural judgments. Moreover, our subject matter firmly straddles t

From the Editors: New Models for Old

[Originally published in the July/August 2009 issue (Volume 7 number 4) of IEEE Security & Privacy magazine.] When faced with a new thing, human beings do something very sensible. They try to harness previous experience and intuition in service of the new thing. How is this new thing like something that I already know and understand? Trying to model the new thing on some old thing can be efficient, making it easier to reason about the new thing by using analogies adopted from previous experience. The late Claude Shannon did this at least twice in his illustrious career. The 1930s were an intense time for digital circuits, with engineers busily designing and building ever more complex machines out of electromechanical relays. Design principles for relay systems were vague and imprecise, with engineers employing rules of thumb and heuristics whose efficacy was limited. The result was a world in which tremendous potential was hampered by a real lack of powerful tools for reasoning a

From the Editors: Reading (with) the Enemy

[Originally published in the January/February 2009 issue (Volume 7 number 1) of IEEE Security & Privacy magazine.] Back in the July/August 2006 issue of IEEE Security & Privacy, the editors of the Book Reviews department wrote an essay entitled "Why We Won't Review Books by Hackers." They argued that to review such books would be to "tacitly endorse a convicted criminal who now wants to pass himself off as a consultant." We published two letters to the editor in the subsequent issue, and that was the end of the topic. Or so you thought. In this issue, I argue that whether or not S&P reviews them, you should read the writings of bad guys, with the usual caveat that you should do so if they have something useful to say and are well written. This topic has been debated for many years, and the positions boil down to one of four basic arguments: The writings of bad guys are morally tainted. We should not reward bad guys for bad behavior. The writing

From the Editors: Cyberassault on Estonia

[This editorial was published originally in "Security & Privacy" Volume 5 Number 4, July/August 2007] Estonia recently survived a massive distributed denial-of-service (DDoS) attack that came on the heels of the Estonian government's relocation of a statue commemorating Russia's 1940s wartime role. This action inflamed the feelings of the substantial Russian population in Estonia, as well as those of various elements in Russia itself. Purple prose then boiled over worldwide, with apocalyptic announcements that a "cyberwar" had been unleashed on the Estonians. Were the attacks initiated by hot-headed nationalists or by a nation state? Accusations and denials have flown, but no nation state has claimed authorship. It's not really difficult to decide if this was cyberwarfare or simple criminality. Current concepts of war require people in uniforms or a public declaration. There's no evidence that such was the case. In addition, there's no reas

From the Editors: Insecurity through Obscurity

[This editorial was published originally in "Security & Privacy" Volume 4 Number 5, September/October 2006] Settling on a design for a system of any sort involves finding a workable compromise among functionality, feasibility, and finance. Does it do enough of what the sponsor wants? Can it be implemented using understood and practical techniques? Is the projected cost reasonable when set against the anticipated revenue or savings? In the case of security projects, functionality is generally stated in terms of immunity or resistance to attacks that seek to exploit known vulnerabilities. The first step in deciding whether to fund a security project is to assess whether its benefits outweigh the costs. This is easy to state but hard to achieve. What are the benefits? Some set of exploits will be thwarted. But how likely would they be to occur if we did nothing? And how likely will they be to occur if we implement the proposed remedy? What is the cost incurred per incident
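The questions the editorial poses (how likely is an incident with and without the remedy, and what does each incident cost?) are the ingredients of a standard expected-loss comparison. Here is a back-of-the-envelope sketch of that calculation; the rates and dollar figures are made-up illustrative numbers, not data from the editorial.

```python
def annual_expected_loss(incidents_per_year: float, cost_per_incident: float) -> float:
    """Expected yearly loss at a given incident rate."""
    return incidents_per_year * cost_per_incident

def net_benefit(baseline_rate: float, residual_rate: float,
                cost_per_incident: float, annual_remedy_cost: float) -> float:
    """Expected annual savings from a remedy, net of what the remedy costs.
    baseline_rate: incidents/year if we do nothing.
    residual_rate: incidents/year after implementing the remedy."""
    avoided = (annual_expected_loss(baseline_rate, cost_per_incident)
               - annual_expected_loss(residual_rate, cost_per_incident))
    return avoided - annual_remedy_cost

# Hypothetical numbers: 4 incidents/year at $50k each; the remedy cuts
# that to 1/year and costs $80k/year to operate.
print(net_benefit(4.0, 1.0, 50_000, 80_000))  # 70000.0 -> worth funding
```

The hard part, as the editorial notes, is not the arithmetic but estimating the incident rates in the first place.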

From the Editors: The Impending Debate

[This editorial was published originally in "Security & Privacy" Volume 4 Number 2, March/April 2006] There's some scary stuff going on in the US right now. President Bush says that he has the authority to order, without a warrant, eavesdropping on telephone calls and emails from and to people who have been identified as terrorists. The question of whether the president has this authority will be resolved by a vigorous debate among the government's legislative, executive, and judicial branches, accompanied, if history is any guide, by copious quantities of impassioned rhetoric and perhaps even the rending of garments and tearing of hair. This is as it should be. The president's assertion is not very far, in some ways, from Google's claims that although its Gmail product examines users' email for the purpose of presenting to them targeted advertisements, user privacy isn't violated because no natural person will examine your email. The ability of

From the Editors: There Ain't No Inside, There Ain't No Outside ...

[This editorial was published originally in "Security & Privacy" Volume 3 Number 5, September/October 2005] There ain't no good guys, there ain't no bad guys, There's only you and me and we just disagree —"We Just Disagree," words and music by Jim Krueger Although Jim Krueger might be right that there are no good guys or bad guys in a romantic disagreement, in computer security, there are definitely good guys and bad guys. What there isn't, however, is an inside or an outside. In the 1990s, before Web browsers emerged, corporations owned or licensed all of the information on a corporate computer display—it was inside. Being inside was a proxy for trusted, whereas being outside meant being untrusted … for everything. Security experts warned that these simple solutions were too coarse, and that the principles of least privilege and separation of concerns should be adopted, but everyone ignored this advice. After the Web became pervasive, however,

From the Editors: What's in a Name?

[This editorial was published originally in "Security & Privacy" Volume 3 Number 2, March/April 2005] "What's in a name? That which we call a rose By any other name would smell as sweet;" —Romeo and Juliet, Act II, Scene ii In ancient times, when the economy was agrarian and people almost never traveled more than a few miles from their places of birth, most people made do with a single personal name. Everyone you met generally knew you, and if there did happen to be two Percivals in town, people learned to distinguish between "tall Percival" and "short Percival." The development of travel and trade increased the number of different people you might meet in a lifetime and led to more complex names. By the Greek classical period, an individual's name had become a three-part structure including a personal name, a patronymic, and a demotic, which identified the person's deme—roughly, one's village or clan. This represented the e

A Fine Tunnel

Some years ago I went skiing with some Italian friends. I flew to their home in Pisa and we drove north along the west coast of the country to Sestriere, at the triple juncture of Switzerland, France, and Italy, a lovely ski area. The road we took went through the area of Genoa. This particular road is a high-speed, limited-access highway, probably the A12 according to modern maps. In this section the coast is very steep, almost cliffs running down to the compact city of Genoa. One section of the road is particularly dramatic: an alternating sequence of bridges and tunnels through the steeply inclined terrain above the city. Anyway, as we drove north we went through one of the longer tunnels. As we were in the middle of this particular tunnel I saw a sign that gave me a peculiar Alice in Wonderland sensation. The sign, a professionally executed one with all the hallmarks of the highway system, showed an outline of a cup of coffee, a little wisp of steam proceeding from its top. Wi